Compare commits

..

71 Commits

Author SHA1 Message Date
Gerard Marull-Paretas
52e06334fb ci: manifest: do not persist git credentials
With this setting enabled, Git credentials are not kept after checkout.
Credentials are not necessary after the checkout step, since we do not
do any further manual push/pull operations.

Signed-off-by: Gerard Marull-Paretas <gerard.marull@nordicsemi.no>
2022-02-25 07:18:16 -05:00
Anas Nashif
8be935fe5c doc: 2.5.1 release notes
Add release notes placeholder for 2.5.1 release.

Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2021-08-03 15:32:18 -04:00
Anas Nashif
37139f93d9 release: Bump release to 2.5.1-rc1
Increase version to 2.5.1-rc1.

Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2021-08-03 15:32:18 -04:00
Daniel Leung
3b369ee0c1 tests: schedule_api: lengthen interval for slicing reset test
When calculating the expected interval for threads other than
the first one, the test uses ms->ticks->cycles conversions to
figure out the bounds of permitted cycles. Both the lower and
upper bound conversions use k_*_to_*_floor32(). When the numbers
involved are not wholly divisible, the fractional parts are
truncated, resulting in incorrect intervals and thus failing
tests. So change the calculation to use floor() or ceil() as
appropriate for each boundary.

Signed-off-by: Daniel Leung <daniel.leung@intel.com>
2021-07-20 20:05:00 -04:00
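
As a minimal illustration of the floor/ceil split described above, using Zephyr's
public time-unit conversion helpers (slice_bounds() is a hypothetical helper, not
the test's actual code):

#include <kernel.h>

static void slice_bounds(uint32_t interval_ms,
			 uint32_t *lower_cyc, uint32_t *upper_cyc)
{
	/* Floor both conversions for the lower bound... */
	*lower_cyc = k_ticks_to_cyc_floor32(k_ms_to_ticks_floor32(interval_ms));
	/* ...and ceil both for the upper bound, so truncation can only widen,
	 * never shrink, the allowed window.
	 */
	*upper_cyc = k_ticks_to_cyc_ceil32(k_ms_to_ticks_ceil32(interval_ms));
}
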
Vinayak Kariappa Chettimada
82db759ce0 Bluetooth: controller: Fix missing encryption procedure state check
Fix a missing encryption procedure state check which allowed
out-of-order reception of the START_ENC_RSP PDU, making the
controller believe it was already in an encryption procedure
in progress.

Fixes #34392.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-07-17 08:11:24 -04:00
Joakim Andersson
5a430eea41 Bluetooth: host: Fix L2CAP sent callback on disconnected channel
When receiving the L2CAP sent callbacks the dynamic L2CAP channel may
have been disconnected already. The user of the dynamic channel should
have received the disconnected and released callbacks for this channel
to release any resources for the data being sent, so simply ignoring
this sent callback is enough.

Fix sent callbacks by providing the CID to the callback instead of a
pointer to potentially released memory, and look up the CID to check that
it is still valid.

Signed-off-by: Joakim Andersson <joakim.andersson@nordicsemi.no>
2021-06-29 13:27:03 -04:00
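
A rough sketch of the approach inside the host's L2CAP code (the lookup helper
name is an assumption, not necessarily the function used in the tree):

static void l2cap_chan_sdu_sent(struct bt_conn *conn, uint16_t cid)
{
	/* Resolve the CID only now; the dynamic channel may already be gone. */
	struct bt_l2cap_chan *chan = l2cap_lookup_tx_cid(conn, cid); /* assumed helper */

	if (chan == NULL) {
		/* Channel disconnected and released: ignoring the callback is enough. */
		return;
	}

	/* ...invoke the channel's sent callback... */
}
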
Joakim Andersson
79d575bf35 Bluetooth: controller: Check length field of scan response data
Check the length field of the scan response data.

Signed-off-by: Joakim Andersson <joakim.andersson@nordicsemi.no>
2021-06-11 16:57:07 +02:00
Dong Wang
dabf237584 interrupt_controller: ioapic: support more device power states
Do nothing when entering or resuming from DEVICE_PM_LOW_POWER_STATE.

Signed-off-by: Dong Wang <dong.d.wang@intel.com>
Signed-off-by: Flavio Ceolin <flavio.ceolin@intel.com>
2021-05-14 16:00:43 -04:00
Martí Bolívar
4e8b382e03 soc: nrf: fix NRF_DT_CHECK_GPIO_CTLR_IS_SOC
This was merged by mistake without being tested and is not working
properly. We need to avoid doing a BUILD_ASSERT() when the relevant
property is missing, because we can't use DT_GPIO_CTLR() on an
undefined property. Handle this with COND_CODE_1().

Signed-off-by: Martí Bolívar <marti.bolivar@nordicsemi.no>
2021-04-22 17:17:08 -04:00
Luiz Augusto von Dentz
b02636053a Bluetooth: ATT: Add documentation for chan_send
chan_send restores the buffer state in case of an error, which is
different from how bt_l2cap_send_cb works, as the latter always unrefs
the buffer in case of an error.

Signed-off-by: Luiz Augusto von Dentz <luiz.von.dentz@intel.com>
2021-04-22 17:16:49 -04:00
Luiz Augusto von Dentz
66f77f9d92 Bluetooth: ATT: Fix crash if bt_l2cap_send_cb fails
This fixes a regression introduced by
10841b9a14, which removed a call to
net_buf_ref that was used not only to keep a reference for resending
but also to prevent bt_l2cap_send_cb from unreffing the buffer in case
it fails.

Signed-off-by: Luiz Augusto von Dentz <luiz.von.dentz@intel.com>
2021-04-22 17:16:49 -04:00
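
A minimal sketch of the reference-counting pattern the two commits above
describe, assuming the host-internal bt_l2cap_send_cb(conn, cid, buf, cb,
user_data) unrefs the buffer when it fails (att_chan_send and the error path
shown here are illustrative, not the actual ATT code):

#include <net/buf.h>

static int att_chan_send(struct bt_conn *conn, uint16_t cid, struct net_buf *buf)
{
	int err;

	net_buf_ref(buf);	/* extra reference kept for a possible resend */

	err = bt_l2cap_send_cb(conn, cid, buf, NULL, NULL);
	if (err) {
		/* bt_l2cap_send_cb() dropped its reference on failure, but buf
		 * is still valid thanks to the extra one, so its state can be
		 * restored and the PDU resent later instead of touching freed
		 * memory.
		 */
	}

	return err;
}
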
Vinayak Kariappa Chettimada
9e3c7b995f Bluetooth: controller: Fix flash driver sync regression
Fix flash driver sync regression introduced due to changed
default ULL_HIGH in the commit 30634334a8 ("Bluetooth:
controller: Fix ULL_HIGH priority to be lower than LLL").

Fixes #33862.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-13 19:53:20 +02:00
Vinayak Kariappa Chettimada
f642feba06 Bluetooth: controller: Increased thread context operation queue count
When the ticker job is disabled inside radio events, all
advertising, scanning, and slave latency cancel ticker
operations will be deferred, requiring an increased ticker
thread context operation queue count.

Relates to #32430.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-13 19:53:20 +02:00
Vinayak Kariappa Chettimada
f9dd4c24e5 Bluetooth: controller: Facilitate reuse of BT_CTLR_MAX_CONNECTABLE
Minor rearrangement of the order of the BT_CTLR_MAX_CONNECTABLE
definition to better reuse it.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-13 19:53:20 +02:00
Vinayak Kariappa Chettimada
fc47320d93 Bluetooth: controller: Consistent use of internal BT_CTLR_ADV_SET
Consistently use the internal BT_CTLR_ADV_SET value in the
controller's implementation.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-13 19:53:20 +02:00
Kumar Gala
ec0aa8331a drivers: uart: uart_cmsdk_apb: fix interrupt handling
The CMSDK UART interrupts for TX and RX can either be treated as a
single interrupt line or as distinct interrupts for TX & RX.  In the case
that they were distinct we didn't get the ifdef correct based on DTS.

If we have 2 interrupts in DTS we assume they are for TX & RX and thus
build the interrupt support for distinct TX & RX ISRs.

Also, clean up handling of UART_2..UART_4 to be similar to how the
UART_0/UART_1 code is using DT_INST_IRQN(x).

Fixes #30770
Fixes #25601

Signed-off-by: Kumar Gala <kumar.gala@linaro.org>
2021-04-08 11:15:29 -05:00
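
An illustrative sketch of the DTS-driven choice (compat, ISR names, IRQ index
order and the config helper are placeholders, not the driver's actual code):

#define DT_DRV_COMPAT arm_cmsdk_uart	/* compat assumed for illustration */

#include <device.h>
#include <devicetree.h>
#include <irq.h>

static void uart_isr(const void *arg)    { /* combined RX+TX handling (placeholder) */ }
static void uart_isr_rx(const void *arg) { /* RX-only handling (placeholder) */ }
static void uart_isr_tx(const void *arg) { /* TX-only handling (placeholder) */ }

static void uart_cmsdk_irq_config(void)
{
#if DT_NUM_IRQS(DT_DRV_INST(0)) == 2
	/* Two interrupts in DTS: assume distinct TX and RX lines, separate ISRs. */
	IRQ_CONNECT(DT_INST_IRQ_BY_IDX(0, 0, irq),
		    DT_INST_IRQ_BY_IDX(0, 0, priority),
		    uart_isr_tx, DEVICE_DT_INST_GET(0), 0);
	IRQ_CONNECT(DT_INST_IRQ_BY_IDX(0, 1, irq),
		    DT_INST_IRQ_BY_IDX(0, 1, priority),
		    uart_isr_rx, DEVICE_DT_INST_GET(0), 0);
#else
	/* Single combined interrupt line, as with UART_0/UART_1. */
	IRQ_CONNECT(DT_INST_IRQN(0), DT_INST_IRQ(0, priority),
		    uart_isr, DEVICE_DT_INST_GET(0), 0);
#endif
}
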
Gustavo Romero
92e005122a tests: Add test to check uart_irq_is_pending
Add test to check if uart_irq_is_pending() correctly returns 0 (meaning
there are no more RX and TX pending interrupts) when all RX and TX data
is processed.

Signed-off-by: Gustavo Romero <gustavo.romero@linaro.org>
2021-04-07 12:04:47 -05:00
Gustavo Romero
bd0fe17fa0 drivers: uart: uart_cmsdk_apb: Fix uart_irq_is_pending
Currently the CMSDK uart_irq_is_pending does not use the RX and TX interrupt
bits found in the INTSTATUS register to check for pending interrupts but
rather checks for pending interrupts indirectly, by checking whether the RX
and TX buffers are, respectively, full and empty, i.e. it checks bits 0 and
1 in the STATE register instead of the bits meant for interrupt status found
in the INTSTATUS register.

That is particularly problematic because although an RX interrupt implies
a full RX buffer and a TX interrupt implies an empty TX buffer, the
converse is not true. For instance, a TX buffer might be empty because all
data was already processed (sent to the serial line) and no further data
was pushed into the TX buffer, so it remained empty, without generating any
additional TX interrupt. In that case the current uart_irq_is_pending
implementation reports that there are pending interrupts because of the
following logic:

/* Return true if rx buffer full or tx buffer empty */
return (UART_STRUCT(dev)->state & (UART_RX_BF | UART_TX_BF))
                                != UART_TX_BF;

which will return 1 (true) if STATE[0] = 0 (TX buffer is empty), since
UART_TX_BF = 1, so STATE[0] != UART_TX_BF, which is true (assuming here,
for the sake of simplicity, that the RX buffer-full bit is 0 too, i.e.
the RX buffer is also empty).

One of the common uses of uart_irq_is_pending is in ISRs, in constructs
like the following:

while (uart_irq_update(dev) && uart_irq_is_pending(dev)) {
  if (uart_irq_rx_ready(dev) == 0) { // RX buffer is empty
    continue;
  }
  // RX buffer is full, process RX data
}

So the ISR can be called due to an RX interrupt. Upon finishing
processing the RX data, uart_irq_is_pending is called to check for any
pending IRQs, and if it happens that the TX buffer is empty (as in the case
where the TX interrupt is totally disabled), execution gets stuck in the
while loop because the TX buffer will never transition to full again, i.e.
it will never have a chance to have STATE[0] = 1, so STATE[0] != UART_TX_BF
is always true.

This commit fixes that undesirable and problematic behavior by making
uart_irq_is_pending use the proper bits in the interrupt status register
(INTSTATUS) to determine whether there are indeed any pending interrupts.

That, on the other hand, requires that the pending interrupt flags are
not cleared automatically when the ISR is called, otherwise
uart_irq_is_pending() would immediately return false on the first call
without any data really being processed inside the ISR. Thus, because
both the RX and TX buffers (FIFOs) are only 1 byte long, this commit clears
the proper interrupt flags precisely when data is processed (fifo_read
and fifo_fill) or when the interrupts are disabled (irq_rx_disable and
irq_tx_disable).

Finally, this commit also takes the chance to update some comments,
especially regarding the need to "prime" when enabling the TX interrupts
(in uart_cmsdk_apb_irq_tx_enable()). The need to "prime" can be verified
in the CMSDK UART reference implementation in Verilog mentioned in the
"Arm Cortex-M System Design Kit" [0], on p. 4-8, section 4.3,
in cmsdk_apb_uart.v. In that implementation it's also possible to verify
that the FIFO is only 1 byte long, justifying the semantics that if the
buffers are not full (STATE[0] or STATE[1] = 0) they are _completely_
empty, holding no data at all.

[0] https://documentation-service.arm.com/static/5e8f1c777100066a414f770b

Signed-off-by: Gustavo Romero <gustavo.romero@linaro.org>
2021-04-07 12:04:47 -05:00
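
A sketch of the corrected check described above (the register field and
bit-mask names are assumptions based on the commit message, not verified
against the CMSDK headers, and UART_STRUCT is the driver's own accessor):

#include <device.h>

static int uart_cmsdk_apb_irq_is_pending(const struct device *dev)
{
	/* Report pending interrupts from the interrupt status register
	 * instead of inferring them from RX/TX buffer state.
	 * Assumed masks: UART_TX_IN = TX interrupt bit, UART_RX_IN = RX
	 * interrupt bit in INTSTATUS.
	 */
	return (UART_STRUCT(dev)->intstatus & (UART_TX_IN | UART_RX_IN)) != 0;
}
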
Gustavo Romero
fe3f71bdad boards: arm: mps2-an521: Fix DT memory regions
Currently the RAM region specified in the DT for the mps2-an521 board to
store data (not to run code) is set to start at 0x3000_0000 and a 16M
contiguous space is assumed. However, at that address there is no such
contiguous space of 16M; only a 128K area is available. As a
consequence, large applications linked with Zephyr might end up using
memory regions that are not valid, especially at runtime when the stack
grows, causing a BusFault.

Application Note 512 only specifies a 16M contiguous space available
starting at 0x8000_0000 (please see 'Table 3-4: SSRAM2 and SSRAM3
address mapping' and 'Table 3-6: External PSRAM mapping to Code Memory',
on pages 3-7 and 3-8, respectively), which resides in the PSRAM
(external RAM).

The AN521 also specifies a 4M contiguous space available starting at
0x3800_0000 which can be used as RAM for data storage and which is not
currently described in the DT.

The current DT also defines a 224M flash region (to run code) which
doesn't effectively exist, because most of it is reserved (~148M).

This commit fixes the incorrect definition of the 0x3000_0000 (16M) region
and hence defines a new region called 'sram2_3' that maps to the
0x3800_0000 (4M) region, which is used as RAM to store data, and fixes the
flash region, defining a new region 'sram1' (4M) from which code is
executed (starting at 0x1000_0000). The board has no real flash memory;
rather, an auxiliary HW populates the appropriate memory regions from
images found on a MicroSD card.

This commit also defines the missing PSRAM (16M) region ('psram'), which
can be used by large programs as a general purpose RAM.

Finally, it also fixes the DT for the non-secure memory regions to
reflect the fixes described above for the secure memory regions.

Signed-off-by: Gustavo Romero <gustavo.romero@linaro.org>
2021-04-07 12:03:48 -05:00
Vinayak Kariappa Chettimada
14bd22db75 Bluetooth: controller: Fix race in create connection cancel
Fix a race condition when create connection cancel is called
and a connection actually did get set up while the initiator is
being stopped.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Vinayak Kariappa Chettimada
d72a1078b2 Bluetooth: controller: Conditional compile adv addr used by initiator
Conditionally compile adv_addr, which is only used by the
controller when supporting the initiator/central role.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Vinayak Kariappa Chettimada
8da72d16b7 Bluetooth: controller: Move initiated flag into role specific union
Move the initiated bit flag into role specific union, so that
other role specific bit flags can share bit space in the LLL
context structure.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Vinayak Kariappa Chettimada
1e320a352f Bluetooth: controller: Fix PHY Update procedure checks
Fix PHY update procedure to check transaction violations.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Vinayak Kariappa Chettimada
6eb998bd78 Bluetooth: controller: Fix ping procedure checks
Fix ping procedure to check transaction violations.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Vinayak Kariappa Chettimada
26e335cc42 Bluetooth: controller: Fix data length update procedure checks
Fix data length update procedure to check transaction
violations.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Vinayak Kariappa Chettimada
871ab25c2f Bluetooth: controller: Fix reject extended indication checks
Fix reject extended indication PDU dispatch to check
transaction violations.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Vinayak Kariappa Chettimada
cd0ed04627 Bluetooth: controller: Fix unknown control PDU checks
Fix unknown control procedure handling to check transaction
violations.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Vinayak Kariappa Chettimada
2f580e1930 Bluetooth: controller: Fix Feature Exchange checks
Fix Feature Exchange procedure to check transaction
violations.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Vinayak Kariappa Chettimada
651afd7c39 Bluetooth: controller: Add control procedure lock
Add a control procedure lock flag to detect procedure
transaction violations.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Vinayak Kariappa Chettimada
ef7cbbfb60 Bluetooth: controller: Fix interval check in CONNECT_IND PDU
Check the interval value in a received CONNECT_IND PDU and
ignore the connection setup if the value is invalid.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Vinayak Kariappa Chettimada
d87cc6f51a Bluetooth: controller: Fix channel map check in CONNECT_IND PDU
Fix the leak of node rx buffer used to generate the
connection complete and CSA#2 event introduced in the
commit 4a5f263e5a ("Bluetooth: controller: split: Validate
chan map and hop value") and the
commit 94d5f0854e ("Bluetooth: controller: fixing error
re. all zero chmap in conn-ind").

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Vinayak Kariappa Chettimada
860bb75a69 Bluetooth: controller: Fix dev assert in CPR implementation
Replace a development assertion in the implementation of the
Connection Parameter Request Procedure with an internal
comment, and handle the transaction violation by ignoring the
PDU.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Vinayak Kariappa Chettimada
ff3659ca51 Bluetooth: controller: Remove redundant connection initiated check
Remove the redundant connection initiated check as the event
is closed on connection initiated and it is sufficient to
check in the prepare_cb function to abort any events in the
pipeline after the connection has been initiated.

Relates to commit 5ce5dc055e ("Bluetooth: controller:
Avoid race between ULL and LLL when initiating conn") and
commit 18f5fb99c1 ("Bluetooth: controller: Remove use of
lll_stop").

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-04-06 09:56:38 -04:00
Wolfgang Puffitsch
c8d190e318 Bluetooth: controller: Remove use of lll_stop
Remove use of lll_stop and lll_is_stop and rely on "initiated" flag in
lll_conn struct instead.

Signed-off-by: Wolfgang Puffitsch <wopu@demant.com>
2021-04-06 09:56:38 -04:00
Wolfgang Puffitsch
0dfa18307e Bluetooth: controller: Avoid race between ULL and LLL when initiating conn
Use an "initiated" flag in the lll_conn struct to guard the processing
of PDUs related to connection initiation (CONNECT_IND,
AUX_CONNECT_RSP). This avoids races between ULL and LLL when creating
a connection.

Signed-off-by: Wolfgang Puffitsch <wopu@demant.com>
2021-04-06 09:56:38 -04:00
Eugeniy Paltsev
6c2ae13e9a linker-defs: Fix sorting order of objects by priority
Commit 0a7b65e tweaked the CREATE_OBJ_LEVEL macro in such a way
that it would break the expected sorting order.

For example, if you had 2, 19, 20, 30 as the levels, we'd end up sorting
these as 19, 2, 20, 30.

Fix this by adding an additional "_" symbol after the init level counter.
That allows keeping the correct sort order (for both GNU and MWDT
toolchains) and distinguishing the init level counter from the section
suffix (for the MWDT toolchain).

Fixes zephyrproject-rtos#33464

Signed-off-by: Eugeniy Paltsev <Eugeniy.Paltsev@synopsys.com>
2021-03-25 22:28:53 -04:00
Andriy Gelman
c418c9aecf drivers: can: Fix sample point calculation
CAN_SYNC_SEG and ts1 are in common units. Both need to be scaled by 1000
to calculate the sample point.

Signed-off-by: Andriy Gelman <andriy.gelman@gmail.com>
2021-03-22 16:12:31 -04:00
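
A worked sketch of the scaling described above (illustrative helper, not the
driver's code): the sample point is expressed in units of 0.1 % (per mille),
so SYNC_SEG + ts1 must be multiplied by 1000 before dividing by the total
number of time quanta:

#include <stdint.h>

#define CAN_SYNC_SEG 1U	/* the sync segment is always one time quantum */

static uint16_t can_sample_point_permille(uint32_t ts1, uint32_t ts2)
{
	/* e.g. ts1 = 13, ts2 = 2 -> (1 + 13) * 1000 / 16 = 875, i.e. 87.5 % */
	return ((CAN_SYNC_SEG + ts1) * 1000U) / (CAN_SYNC_SEG + ts1 + ts2);
}
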
Johan Hedberg
61682bd6d9 Bluetooth: L2CAP: Fix missing buffer length check for sdu_len
We should verify that the buffer has sufficient data before attempting
to parse the SDU length field. If we get a too short packet just
disconnect the channel.

Fixes #32497

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2021-03-22 13:09:58 -04:00
Vinayak Kariappa Chettimada
0249a62662 Bluetooth: controller: Fix regression in ctrl tx queue handling
Fix a control Tx buffer leak into the data Tx pool that happens
after a cross-over control procedure response was paused due
to a currently active encryption setup procedure, and a new
control Tx PDU in addition to the paused one is enqueued
thereafter.

When the paused control Tx PDU is resumed but not yet enqueued
towards the radio, and a new control Tx PDU has been enqueued,
the paused control Tx PDU is not set as the head of the
control PDUs in the Tx queue. This caused the paused control
Tx PDU to be associated with the data Tx pool, hence the
incorrect release into the data Tx pool.

Relates to the commit bff76b4cce ("Bluetooth: controller:
split: Fix control tx queue handling") and to the
commit 6991d09977 ("Bluetooth: controller: Fix control tx
queue handling").

Fixes #32898.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-03-19 07:33:19 -04:00
Luiz Augusto von Dentz
550939e441 Bluetooth: L2CAP: Fix not including all DCIDs
The order of the Destination CIDs shall correspond to the order of the
Source CIDs, including their count, so errors that don't result in all
connections being refused shall not break the order of CIDs.

Signed-off-by: Luiz Augusto von Dentz <luiz.von.dentz@intel.com>
2021-03-18 08:39:31 -04:00
Luiz Augusto von Dentz
56c712d2ca Bluetooth: L2CAP: Fix invalid BT_L2CAP_ECRED_CONN_RSP
For errors that mean all connections have been refused, there is no need
to add DCIDs since none will be valid.

Signed-off-by: Luiz Augusto von Dentz <luiz.von.dentz@intel.com>
2021-03-18 08:39:31 -04:00
Luiz Augusto von Dentz
ab2476d88e Bluetooth: L2CAP: Fix not checking for L2CAP_ECRED_CHAN_MAX
When receiving L2CAP_CREDIT_BASED_CONNECTION_REQ the remote may request
more channels than allowed, so check whether the number of channels
surpasses the maximum (5) and return an error if it does.

Signed-off-by: Luiz Augusto von Dentz <luiz.von.dentz@intel.com>
2021-03-18 08:39:31 -04:00
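
A minimal sketch of the bound check (helper name and error handling are
illustrative; only the limit of 5 channels comes from the commit):

#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

#define L2CAP_ECRED_CHAN_MAX 5

/* Each requested channel contributes one 16-bit source CID to the request. */
static bool ecred_conn_req_count_ok(size_t scid_bytes)
{
	size_t num_chan = scid_bytes / sizeof(uint16_t);

	return num_chan >= 1 && num_chan <= L2CAP_ECRED_CHAN_MAX;
}
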
Joakim Andersson
1626b87d92 Bluetooth: host: Overwrite existing bond when IRK has been updated
Overwrite the existing bond when the IRK of the existing bond could not
resolve the RPA of the peer. This would happen if the peer has deleted
the bond and replaced the IRK that was used.

Signed-off-by: Joakim Andersson <joakim.andersson@nordicsemi.no>
2021-03-18 08:39:19 -04:00
Joakim Andersson
500c4f2cd3 Bluetooth: host: Refactor update_keys_check to operate on keys as input
Refactor the update_keys_check helper function to operate on keys given
as input. This allows the function to be re-used on a keys structure that
is not the current connection's keys.

This also avoids the helper function changing the connection state.
The conn->le.keys pointer should at this point always have been
assigned, as central when sending the pairing request, and as peripheral
when receiving the pairing request at the very latest.

Signed-off-by: Joakim Andersson <joakim.andersson@nordicsemi.no>
2021-03-18 08:39:19 -04:00
Marcin Niestroj
0fa47361c8 net: offload: fix device and driver names generated by DT device macros
Generated device and driver names have changed with commits [1] and [2],
but net offload drivers missed the conversion. Update net offload DT
device macros now.

[1] commit 8c1bef535b ("device: support generating defines from
  devicetree nodes with no label")
[2] commit f91e9fba51 ("device: fix potential truncation of DT-derived
  device names")

Signed-off-by: Marcin Niestroj <m.niestroj@grinn-global.com>
2021-03-16 08:19:03 -04:00
Erwan Gouriou
6cf2fdc5d1 samples/driver: watchdog: wwdg sample should run with west
A different configuration is applied when running this test
with west versus with twister.
Both will use the disco_l475_iot1 overlay, but the extra config
for clock bus tuning is applied only when using twister, and the
sample fails when run with west.
Move this extra config from sample.yaml to the board .conf file so
it is applied in both cases.

Fixes #32376

Signed-off-by: Erwan Gouriou <erwan.gouriou@linaro.org>
2021-02-24 09:05:57 -05:00
Jukka Rissanen
343a26d1ae net: mgmt: Use proper coop thread priority value
If the user sets CONFIG_NUM_PREEMPT_PRIORITIES=0, then the priority
of the net_mgmt thread will be -1, which is the same as the idle thread.
This will trigger an assert in the kernel, as the minimum coop priority
is -2 in this case. Remove the net_mgmt thread priority setting from
the Kconfig file, as it is of low value, and set the coop thread priority
the same way the other network threads do.

Fixes #32375

Signed-off-by: Jukka Rissanen <jukka.rissanen@linux.intel.com>
2021-02-24 09:05:25 -05:00
Erwan Gouriou
7179e1d6a9 boards: stm32f1: Remove useless CONFIG_CLOCK_STM32_PLL_XTPRE=n
The CLOCK_STM32_PLL_XTPRE Kconfig symbol's default value is n,
so there is no need to explicitly set it to 'n' in stm32f1 boards.

Signed-off-by: Erwan Gouriou <erwan.gouriou@linaro.org>
2021-02-19 09:42:33 -05:00
Erwan Gouriou
664df1e75c drivers/clock_control: stm32f1: Reinstanciate CLOCK_STM32_PLL_XTPRE
This reverts commit 9be1f7e22f ("drivers/clock_control: Remove useless
CLOCK_STM32_PLL_XTPRE config").

Fixes #32382

Signed-off-by: Erwan Gouriou <erwan.gouriou@linaro.org>
2021-02-19 09:42:33 -05:00
Alexander Wachter
cb97dd13ff ztest: fix z_assert_within() bounds
zassert_within should also compare for equality instead of
only strictly greater/less than.
Examples:
zassert_within(1,1,0); // should return true
zassert_within(1,2,1); // should return true

Signed-off-by: Alexander Wachter <alexander@wachter.cloud>
2021-02-19 07:10:39 -05:00
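
A sketch of the corrected comparison (illustrative macro, not ztest's actual
implementation): the delta is inclusive at both ends, so both examples above
hold:

#define WITHIN(value, expected, delta) \
	(((value) >= ((expected) - (delta))) && \
	 ((value) <= ((expected) + (delta))))

/* WITHIN(1, 1, 0) -> true (equality allowed), WITHIN(1, 2, 1) -> true */
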
Andy Ross
0e64b47a5d tests/timer_api: Correct precision and fix correctness mistakes
Correct a bunch of precision/analysis errors in this test:

* Test items weren't consistent about tick alignment and resetting of
  the timestamp, so put these steps into init_timer_data() and call
  that immediately before k_timer_start().

* Many items would calculate the initial timestamp AFTER
  k_timer_start(), leading to an extra (third!) point where the timer
  computation could alias by an extra tick.  Always do this
  consistently before the timer is started (via init_timer_data()).

* Tickless systems with high tick rates can easily advance the system
  uptime while the timer ISR is running, so the system can't expect
  perfect accuracy even there (this test was originally written for
  ticked systems where the ISR was by definition happening "at the
  same time").

  (Unfortunately our most popular high tick rate tickless system,
  nRF5, also has a clock that doesn't divide milliseconds exactly, so
  it had a special path through all these precision comparisons and
  avoided the bugs.  We finally found it on a x86 HPET system with 10
  kHz ticks.)

* The interval validation was placing a minimum bound on the interval
  time but not a maximum (this mistake was what had hidden the failure
  to reset the timestamp mentioned above).

Longer term, the millisecond precision math in these tests is at this
point an out of control complexity explosion.  We should look at
reworking the core OS tests of k_timer to use tick precision (which is
by definition exact) pervasively and leave the millisecond stuff to a
separate layer testing the alternative/legacy APIs.

Fixes #31964 (probably -- that was reported against up_squared, on
which I had trouble reproducing, but it was a common failure on
ehl_crb).

Signed-off-by: Andy Ross <andrew.j.ross@intel.com>
2021-02-19 07:10:20 -05:00
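
A minimal sketch of the ordering the commit enforces (function bodies are
illustrative, not the test's code): tick alignment and the timestamp reset
live in init_timer_data(), called immediately before k_timer_start(), so only
one place remains where tick aliasing can occur:

#include <kernel.h>

static int64_t tdata_timestamp;

static void init_timer_data(void)
{
	k_usleep(1);				/* align to the next tick boundary */
	tdata_timestamp = k_uptime_get();	/* reset the reference timestamp */
}

static void start_test_timer(struct k_timer *timer)
{
	init_timer_data();			/* always reset right before starting */
	k_timer_start(timer, K_MSEC(100), K_MSEC(50));
}
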
Jukka Rissanen
18ddabc335 net: tls: Allow access to TLS socket in userspace
If userspace is enabled, then the TLS context needs to be
made a NET_SOCKET kernel object. Without this, userspace
cannot access TLS sockets.

Signed-off-by: Jukka Rissanen <jukka.rissanen@linux.intel.com>
2021-02-19 07:10:08 -05:00
Peter Bigot
c6921d2026 drivers/i2c: stm32_v1: remove unused variable
A recent patch removed use of the cfg structure, but left a pointer to
it defined which causes build failures.

Signed-off-by: Peter Bigot <peter.bigot@nordicsemi.no>
2021-02-19 07:09:56 -05:00
Erwan Gouriou
4c46e0072d drivers/i2c: stm32_v1: TX IRQ enable called twice in a row
On I2C V1 parts, LL_I2C_EnableIT_TX() translates to EVT and BUF
IRQ enabling.
In the stm32_i2c_msg_write function, LL_I2C_EnableIT_TX is called right
after stm32_i2c_enable_transfer_interrupts, which already enables the BUF
IRQ, which starts the transfer.
As a consequence, it could happen that the transfer is already complete
by the time LL_I2C_EnableIT_TX is called. This case is not expected
by the remaining part of the code, which loops forever waiting for the
BUF IRQ to be raised.
Remove the superfluous LL_I2C_EnableIT_TX call.

Fixes: #32265

Signed-off-by: Erwan Gouriou <erwan.gouriou@linaro.org>
2021-02-19 07:09:36 -05:00
Andrzej Puzdrowski
dd0f451c23 drivers/flash/soc_flash_nrf: suspend POFWARN before engage
nRF52840, nRF52810, nRF52811 and nRF52805 are affected by anomaly 242.
This patch introduces a workaround for this anomaly as follows:
the power-fail comparator is disabled before any attempt to erase or write,
and neither erase nor write proceeds if EVENT_POFWARN is
already asserted.

Signed-off-by: Andrzej Puzdrowski <andrzej.puzdrowski@nordicsemi.no>
2021-02-19 07:09:25 -05:00
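
A heavily simplified sketch of the workaround described above, assuming direct
use of the nRF POWER peripheral registers (the actual driver goes through its
own abstractions and synchronization; the helper is illustrative):

#include <errno.h>
#include <nrfx.h>

static int flash_op_with_pof_suspended(void (*op)(void))
{
	if (NRF_POWER->EVENTS_POFWARN) {
		/* Supply already failing: do not start an erase or write. */
		return -ECANCELED;
	}

	uint32_t pofcon = NRF_POWER->POFCON;

	NRF_POWER->POFCON = pofcon & ~POWER_POFCON_POF_Msk;	/* disable comparator */
	op();							/* erase or write */
	NRF_POWER->POFCON = pofcon;				/* restore */

	return 0;
}
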
Andrzej Puzdrowski
eb3d194475 drivers/flash/soc_flash_nrf: support in-progress abort
Introduce support for the situation where the synchronization back-end
aborts an operation before it is done. The synchronization API will
transfer the operation return code back to the driver shim.

Additionally:
the FLASH_OP_ONGOING value was switched to be positive in order not to
be mistaken for a negative error code.

Signed-off-by: Andrzej Puzdrowski <andrzej.puzdrowski@nordicsemi.no>
2021-02-19 07:09:25 -05:00
Jose Alberto Meza
b4b2903e5a samples: drivers: espi: Showcase OOB Rx asynchronous handling
Add example to handle OOB host responses asynchronously.

Signed-off-by: Jose Alberto Meza <jose.a.meza.arellano@intel.com>
2021-02-19 07:09:07 -05:00
Jose Alberto Meza
dfabe3ed34 samples: drivers: espi: Remove magic number
Define macro for number of times the temperatures will
be retrieved.

Signed-off-by: Jose Alberto Meza <jose.a.meza.arellano@intel.com>
2021-02-19 07:09:07 -05:00
Jose Alberto Meza
acb8a1f0cf drivers: espi: xec: Add support for host-initiated traffic
Keep OOB Rx channel and interrupt always enabled.

Send callback when packet is received in OOB Rx channel
if asynchronous host-initiated handling is enabled.

Note that the driver doesn't perform any buffering of packets,
so access to the OOB Rx channel is gated by the client driver's
packet retrieval.

Signed-off-by: Jose Alberto Meza <jose.a.meza.arellano@intel.com>
2021-02-19 07:09:07 -05:00
Jose Alberto Meza
cdfd0a9a8d drivers: espi: config: Support for host-initiated eSPI traffic
Currently the assumption is that all OOB traffic over the eSPI bus is
always client initiated.
Add an option for systems where the host can initiate OOB traffic.

Signed-off-by: Jose Alberto Meza <jose.a.meza.arellano@intel.com>
2021-02-19 07:09:07 -05:00
Antonis Sioutas
a36d098117 drivers: counter: stm32: Fix alarm time calculation
The calculated alarm time starts from 2000, but gmtime_r needs as
input the time from the epoch (1970). This causes the alarm time to be
miscalculated due to leap years, as 2000 is a leap year and 1970 is not.
To fix the issue, the Unix timestamp of 2000-01-01 can be added to the
input time of gmtime_r.

Fixes #32260

Signed-off-by: Antonis Sioutas <antonis.si510@gmail.com>
2021-02-19 07:08:53 -05:00
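
A worked sketch of the epoch adjustment (the helper and the T_TIME_OFFSET name
are illustrative): the counter counts seconds from 2000-01-01, while gmtime_r()
expects seconds since 1970-01-01, so the 946,684,800-second offset between the
two epochs is added first:

#include <stdint.h>
#include <time.h>

#define T_TIME_OFFSET 946684800UL	/* seconds from 1970-01-01 to 2000-01-01 */

static struct tm *alarm_seconds_to_tm(uint32_t seconds_since_2000, struct tm *out)
{
	time_t unix_time = (time_t)(seconds_since_2000 + T_TIME_OFFSET);

	return gmtime_r(&unix_time, out);
}
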
Vinayak Kariappa Chettimada
73a3eba539 tests: Bluetooth: bsim: Enable LL/CON/INI/BV-24-C EDTT test
Enable the LL/CON/INI/BV-24-C [Network Privacy - Connection
Establishment using resolving list, initiator, Ignore
Identity Address, with address resolution disabled] EDTT
test.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-02-19 07:08:43 -05:00
Vinayak Kariappa Chettimada
8ff077a65a Bluetooth: controller: Fix network privacy with resolution disabled
When address resolution is disabled, an identity address has
been added to the resolving list with a peer IRK, and device
privacy has not been selected for the peer device, then a
connection indication shall not be sent to the peer that is
advertising using its identity address.

Fixes #24731.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2021-02-19 07:08:43 -05:00
Torsten Rasmussen
116425ccbe cmake: improved handling of output and byproducts in CMake
Fixes: #23449

This commit adds additional OUTPUT and BYPRODUCTS entries to custom
commands and targets in the Zephyr build system.

This ensures that files produced during the build will be removed again
when invoking ninja clean / make clean.

The generated syscall headers include folder is added to the syscall
target using the ADDITIONAL_CLEAN_FILES property.
However, this property is new in CMake 3.15, so it will not work when
using older CMake with ninja.

For CMake versions <=3.15 the ADDITIONAL_MAKE_CLEAN_FILES property is
used. However, this only supports the Makefile generator.

Signed-off-by: Torsten Rasmussen <Torsten.Rasmussen@nordicsemi.no>
2021-02-19 07:08:24 -05:00
Erwan Gouriou
ca83ca0065 drivers/uart: stm32: Fix comparisons to have constants on right side
Fixes: "WARNING:CONSTANT_COMPARISON: Comparisons should place the
constant on the right side of the test"

Signed-off-by: Erwan Gouriou <erwan.gouriou@linaro.org>
2021-02-19 07:08:13 -05:00
Erwan Gouriou
213d348b92 drivers/uart: stm32: Report 9bits transactions as not supported
The STM32 uart driver doesn't support 9-bit transactions in any case,
so remove the case where it was declared as supported.

Fixes #31799

Signed-off-by: Erwan Gouriou <erwan.gouriou@linaro.org>
2021-02-19 07:08:13 -05:00
Alexandre Bourdiol
205c16afa6 test: arch: arm: arm_interrupt: clear FPSCR register in ISR
Clear the Floating Point Status and Control Register (FPSCR)
to prevent the interrupt line from being set to pending again,
in case the FPU IRQ is selected by the test as the "Available IRQ line".

Fixes #31982

Signed-off-by: Alexandre Bourdiol <alexandre.bourdiol@st.com>
2021-02-19 07:07:59 -05:00
Johann Fischer
af67c16b9f usb: bluetooth: fix OUT buffer handling
acl_read_cb does not handle incoming (ACL) data
if BT_CTLR_TX_BUFFER_SIZE is greater than
USB_MAX_FS_BULK_MPS - BT_HCI_ACL_HDR_SIZE.

Since the host adjusts the data according to
the BT_CTLR_TX_BUFFER_SIZE and does not use ZLP
we cannot start usb_transfer over the possible length of
the whole packet, with or without USB_TRANS_NO_ZLP flag.
But we can read the packet length from the header and
call net_buf_put() when the whole packet is received.

Fixes: #31922

Reported-by: Matias Karhumaa <matias.karhumaa@gmail.com>
Signed-off-by: Johann Fischer <johann.fischer@nordicsemi.no>
2021-02-17 09:24:22 -05:00
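
A simplified sketch of the length-based reassembly idea (the helper name is
illustrative; includes follow the 2.5-era paths): read the expected payload
length from the ACL header of the first fragment, then hand the buffer to the
host only once that many bytes have arrived, instead of relying on the USB
transfer length or ZLPs:

#include <bluetooth/hci.h>
#include <sys/byteorder.h>

static uint16_t acl_total_len(const uint8_t *first_fragment)
{
	const struct bt_hci_acl_hdr *hdr = (const void *)first_fragment;

	/* Full packet size = ACL header + little-endian payload length. */
	return sizeof(*hdr) + sys_le16_to_cpu(hdr->len);
}
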
Andy Ross
46d95a23f0 lib/p4wq: Fix race with completed work items
Work items can be legally resubmitted from within their own handler.
Currently the p4wq detects this case by checking their thread field to
see if it's been set to NULL.  But that's a race, because if the item
was NOT resubmitted then it no longer belongs to the queue and may
have been freed or reused or otherwise clobbered legally by user code.

Instead, steal a single bit in the thread struct for this purpose.
This patch adds a K_CALLBACK_STATE bit in user_options and documents
it in such a way (as being intended for "callback manager" utilities)
that it can't be used recursively or otherwise collide.

Fixes #32052

Signed-off-by: Andy Ross <andrew.j.ross@intel.com>
2021-02-17 09:24:00 -05:00
Marcin Niestroj
3de4ba01e4 drivers: wifi: esp: fix hostname configuration
By the time hostname configuration was implemented, the driver was
switching only between STA and STA+AP modes. After dynamic selection
between NONE, STA, AP and STA+AP was implemented (commit referenced
below), hostname configuration no longer takes effect when the ESP chip
obtains an address over DHCP (and sends the hostname in the DHCP request).

Set the hostname each time after enabling STA mode, so that it takes
effect in DHCP requests.

Fixes: 03ce61004b ("drivers: wifi: esp: control CWMODE depending on
  current needs")
Signed-off-by: Marcin Niestroj <m.niestroj@grinn-global.com>
2021-02-17 09:23:35 -05:00
Matias Karhumaa
6031f04519 Bluetooth: hci_raw: Fix switch fallthrough
Fix apparently accidental switch fallthrough in bt_buf_get_tx().

Signed-off-by: Matias Karhumaa <matias.karhumaa@gmail.com>
2021-02-17 09:23:13 -05:00
62498 changed files with 1113007 additions and 5273985 deletions

.buildkite/daily.yml Normal file

@@ -0,0 +1,32 @@
steps:
- command:
- .buildkite/run.sh
env:
ZEPHYR_TOOLCHAIN_VARIANT: "zephyr"
ZEPHYR_SDK_INSTALL_DIR: "/opt/sdk/zephyr-sdk-0.12.2"
parallelism: 400
timeout_in_minutes: 210
retry:
manual: true
plugins:
- docker#v3.5.0:
image: "zephyrprojectrtos/ci:v0.11.13"
propagate-environment: true
volumes:
- "/var/lib/buildkite-agent/git-mirrors:/var/lib/buildkite-agent/git-mirrors"
- "/var/lib/buildkite-agent/zephyr-module-cache:/var/lib/buildkite-agent/zephyr-module-cache"
- "/var/lib/buildkite-agent/zephyr-ccache:/root/.ccache"
workdir: "/workdir/zephyr"
agents:
- "queue=default"
- wait: ~
continue_on_failure: true
- plugins:
- junit-annotate#v1.7.0:
artifacts: twister-*.xml
notify:
- email: "builds+int+399+7809482394022958124@lists.zephyrproject.org"
if: build.state != "passed"

.buildkite/hooks/post-command Executable file

@@ -0,0 +1,8 @@
#!/bin/bash
# Copyright (c) 2020 Linaro Limited
#
# SPDX-License-Identifier: Apache-2.0
# report disk usage:
echo "--- $0 disk usage"
df -h

.buildkite/hooks/pre-command Executable file

@@ -0,0 +1,44 @@
#!/bin/bash
# Copyright (c) 2020 Linaro Limited
#
# SPDX-License-Identifier: Apache-2.0
# Save off where we started so we can go back there
WORKDIR=${PWD}
echo "--- $0 disk usage"
df -h
du -hs /var/lib/buildkite-agent/*
docker images -a
docker system df -v
if [ -n "${BUILDKITE_PULL_REQUEST_BASE_BRANCH}" ]; then
git fetch -v origin ${BUILDKITE_PULL_REQUEST_BASE_BRANCH}
git checkout FETCH_HEAD
git config --local user.email "builds@zephyrproject.org"
git config --local user.name "Zephyr CI"
git merge --no-edit "${BUILDKITE_COMMIT}" || {
local merge_result=$?
echo "Merge failed: ${merge_result}"
git merge --abort
exit $merge_result
}
fi
mkdir -p /var/lib/buildkite-agent/zephyr-ccache/
# create cache dirs, no-op if they already exist
mkdir -p /var/lib/buildkite-agent/zephyr-module-cache/modules
mkdir -p /var/lib/buildkite-agent/zephyr-module-cache/tools
mkdir -p /var/lib/buildkite-agent/zephyr-module-cache/bootloader
# Clean cache - if it already exists
cd /var/lib/buildkite-agent/zephyr-module-cache
find -type f -not -path "*/.git/*" -not -name ".git" -delete
# Remove any stale locks
find -name index.lock -delete
# return from where we started so we can find pipeline files from
# git repo
cd ${WORKDIR}

.buildkite/pipeline.yml Normal file

@@ -0,0 +1,28 @@
steps:
- command:
- .buildkite/run.sh
env:
ZEPHYR_TOOLCHAIN_VARIANT: "zephyr"
ZEPHYR_SDK_INSTALL_DIR: "/opt/sdk/zephyr-sdk-0.12.2"
parallelism: 20
timeout_in_minutes: 180
retry:
manual: true
plugins:
- docker#v3.5.0:
image: "zephyrprojectrtos/ci:v0.11.13"
propagate-environment: true
volumes:
- "/var/lib/buildkite-agent/git-mirrors:/var/lib/buildkite-agent/git-mirrors"
- "/var/lib/buildkite-agent/zephyr-module-cache:/var/lib/buildkite-agent/zephyr-module-cache"
- "/var/lib/buildkite-agent/zephyr-ccache:/root/.ccache"
workdir: "/workdir/zephyr"
agents:
- "queue=default"
- wait: ~
continue_on_failure: true
- plugins:
- junit-annotate#v1.7.0:
artifacts: twister-*.xml

.buildkite/run.sh Executable file

@@ -0,0 +1,75 @@
#!/bin/bash
# Copyright (c) 2020 Linaro Limited
#
# SPDX-License-Identifier: Apache-2.0
set -eE
function cleanup()
{
# Rename twister junit xml for use with junit-annotate-buildkite-plugin
# create dummy file if twister did nothing
if [ ! -f twister-out/twister.xml ]; then
touch twister-out/twister.xml
fi
mv twister-out/twister.xml twister-${BUILDKITE_JOB_ID}.xml
buildkite-agent artifact upload twister-${BUILDKITE_JOB_ID}.xml
# Upload test_file to get list of tests that are build/run
if [ -f test_file.txt ]; then
buildkite-agent artifact upload test_file.txt
fi
# ccache stats
echo "--- ccache stats at finish"
ccache -s
# disk usage
echo "--- disk usage at finish"
df -h
}
trap cleanup ERR
echo "--- run $0"
git log -n 5 --oneline --decorate --abbrev=12
# Setup module cache
cd /workdir
ln -s /var/lib/buildkite-agent/zephyr-module-cache/modules
ln -s /var/lib/buildkite-agent/zephyr-module-cache/tools
ln -s /var/lib/buildkite-agent/zephyr-module-cache/bootloader
cd /workdir/zephyr
export JOB_NUM=$((${BUILDKITE_PARALLEL_JOB}+1))
# ccache stats
echo ""
echo "--- ccache stats at start"
ccache -s
if [ -n "${DAILY_BUILD}" ]; then
TWISTER_OPTIONS=" --inline-logs -N --build-only --all --retry-failed 3 -v "
echo "--- DAILY BUILD"
west init -l .
west update 1> west.update.log || west update 1> west.update-2.log
west forall -c 'git reset --hard HEAD'
source zephyr-env.sh
./scripts/twister --subset ${JOB_NUM}/${BUILDKITE_PARALLEL_JOB_COUNT} ${TWISTER_OPTIONS}
else
if [ -n "${BUILDKITE_PULL_REQUEST_BASE_BRANCH}" ]; then
./scripts/ci/run_ci.sh -c -b ${BUILDKITE_PULL_REQUEST_BASE_BRANCH} -r origin \
-m ${JOB_NUM} -M ${BUILDKITE_PARALLEL_JOB_COUNT} -p ${BUILDKITE_PULL_REQUEST}
else
./scripts/ci/run_ci.sh -c -b ${BUILDKITE_BRANCH} -r origin \
-m ${JOB_NUM} -M ${BUILDKITE_PARALLEL_JOB_COUNT};
fi
fi
TWISTER_EXIT_STATUS=$?
cleanup
exit ${TWISTER_EXIT_STATUS}


@@ -5,6 +5,7 @@
--min-conf-desc-length=1
--typedefsfile=scripts/checkpatch/typedefsfile
--ignore BRACES
--ignore PRINTK_WITHOUT_KERN_LEVEL
--ignore SPLIT_STRING
--ignore VOLATILE
@@ -26,7 +27,4 @@
--ignore TRAILING_SEMICOLON
--ignore COMPLEX_MACRO
--ignore MULTISTATEMENT_MACRO_USE_DO_WHILE
--ignore ENOSYS
--ignore IS_ENABLED_CONFIG
--ignore EXPORT_SYMBOL
--ignore COMPARISON_TO_NULL
--exclude ext


@@ -1,52 +1,82 @@
# SPDX-License-Identifier: Apache-2.0
# SPDX-License-Identifier: GPL-2.0
#
# Note: The list of ForEachMacros can be obtained using:
# clang-format configuration file. Intended for clang-format >= 4.
#
# git grep -h '^#define [^[:space:]]*FOR_EACH[^[:space:]]*(' include/ \
# | sed "s,^#define \([^[:space:]]*FOR_EACH[^[:space:]]*\)(.*$, - '\1'," \
# | sort | uniq
# For more information, see:
#
# Documentation/process/clang-format.rst
# https://clang.llvm.org/docs/ClangFormat.html
# https://clang.llvm.org/docs/ClangFormatStyleOptions.html
#
# References:
# - https://clang.llvm.org/docs/ClangFormatStyleOptions.html
---
BasedOnStyle: LLVM
AlignConsecutiveMacros: AcrossComments
AllowShortBlocksOnASingleLine: Never
AccessModifierOffset: -4
AlignAfterOpenBracket: Align
AlignConsecutiveAssignments: false
AlignConsecutiveDeclarations: false
#AlignEscapedNewlines: Left # Unknown to clang-format-4.0
AlignOperands: true
AlignTrailingComments: false
AllowAllParametersOfDeclarationOnNextLine: false
AllowShortBlocksOnASingleLine: false
AllowShortCaseLabelsOnASingleLine: false
AllowShortEnumsOnASingleLine: false
AllowShortFunctionsOnASingleLine: None
AllowShortIfStatementsOnASingleLine: false
AllowShortLoopsOnASingleLine: false
AttributeMacros:
- __aligned
- __deprecated
- __packed
- __printf_like
- __syscall
- __syscall_always_inline
- __subsystem
BitFieldColonSpacing: After
BreakBeforeBraces: Linux
ColumnLimit: 100
AlwaysBreakAfterDefinitionReturnType: None
AlwaysBreakAfterReturnType: None
AlwaysBreakBeforeMultilineStrings: false
AlwaysBreakTemplateDeclarations: false
BinPackArguments: true
BinPackParameters: true
BraceWrapping:
AfterClass: false
AfterControlStatement: false
AfterEnum: false
AfterFunction: true
AfterNamespace: true
AfterObjCDeclaration: false
AfterStruct: false
AfterUnion: false
#AfterExternBlock: false # Unknown to clang-format-5.0
BeforeCatch: false
BeforeElse: false
IndentBraces: false
#SplitEmptyFunction: true # Unknown to clang-format-4.0
#SplitEmptyRecord: true # Unknown to clang-format-4.0
#SplitEmptyNamespace: true # Unknown to clang-format-4.0
BreakBeforeBinaryOperators: None
BreakBeforeBraces: Custom
#BreakBeforeInheritanceComma: false # Unknown to clang-format-4.0
BreakBeforeTernaryOperators: false
BreakConstructorInitializersBeforeComma: false
#BreakConstructorInitializers: BeforeComma # Unknown to clang-format-4.0
BreakAfterJavaFieldAnnotations: false
BreakStringLiterals: false
ColumnLimit: 80
CommentPragmas: '^ IWYU pragma:'
#CompactNamespaces: false # Unknown to clang-format-4.0
ConstructorInitializerAllOnOneLineOrOnePerLine: false
ConstructorInitializerIndentWidth: 8
ContinuationIndentWidth: 8
Cpp11BracedListStyle: false
DerivePointerAlignment: false
DisableFormat: false
ExperimentalAutoDetectBinPacking: false
#FixNamespaceComments: false # Unknown to clang-format-4.0
# Taken from:
# git grep -h '^#define [^[:space:]]*FOR_EACH[^[:space:]]*(' include/ \
# | sed "s,^#define \([^[:space:]]*FOR_EACH[^[:space:]]*\)(.*$, - '\1'," \
# | sort | uniq
ForEachMacros:
- 'ARRAY_FOR_EACH'
- 'ARRAY_FOR_EACH_PTR'
- 'FOR_EACH'
- 'FOR_EACH_FIXED_ARG'
- 'FOR_EACH_IDX'
- 'FOR_EACH_IDX_FIXED_ARG'
- 'FOR_EACH_NONEMPTY_TERM'
- 'FOR_EACH_FIXED_ARG_NONEMPTY_TERM'
- 'RB_FOR_EACH'
- 'RB_FOR_EACH_CONTAINER'
- 'SYS_DLIST_FOR_EACH_CONTAINER'
- 'SYS_DLIST_FOR_EACH_CONTAINER_SAFE'
- 'SYS_DLIST_FOR_EACH_NODE'
- 'SYS_DLIST_FOR_EACH_NODE_SAFE'
- 'SYS_SEM_LOCK'
- 'SYS_SFLIST_FOR_EACH_CONTAINER'
- 'SYS_SFLIST_FOR_EACH_CONTAINER_SAFE'
- 'SYS_SFLIST_FOR_EACH_NODE'
@@ -55,62 +85,61 @@ ForEachMacros:
- 'SYS_SLIST_FOR_EACH_CONTAINER_SAFE'
- 'SYS_SLIST_FOR_EACH_NODE'
- 'SYS_SLIST_FOR_EACH_NODE_SAFE'
- '_WAIT_Q_FOR_EACH'
- 'Z_FOR_EACH'
- 'Z_FOR_EACH_ENGINE'
- 'Z_FOR_EACH_EXEC'
- 'Z_FOR_EACH_FIXED_ARG'
- 'Z_FOR_EACH_FIXED_ARG_EXEC'
- 'Z_FOR_EACH_IDX'
- 'Z_FOR_EACH_IDX_EXEC'
- 'Z_FOR_EACH_IDX_FIXED_ARG'
- 'Z_FOR_EACH_IDX_FIXED_ARG_EXEC'
- 'Z_GENLIST_FOR_EACH_CONTAINER'
- 'Z_GENLIST_FOR_EACH_CONTAINER_SAFE'
- 'Z_GENLIST_FOR_EACH_NODE'
- 'Z_GENLIST_FOR_EACH_NODE_SAFE'
- 'STRUCT_SECTION_FOREACH'
- 'STRUCT_SECTION_FOREACH_ALTERNATE'
- 'TYPE_SECTION_FOREACH'
- 'K_SPINLOCK'
- 'COAP_RESOURCE_FOREACH'
- 'COAP_SERVICE_FOREACH'
- 'COAP_SERVICE_FOREACH_RESOURCE'
- 'HTTP_RESOURCE_FOREACH'
- 'HTTP_SERVER_CONTENT_TYPE_FOREACH'
- 'HTTP_SERVICE_FOREACH'
- 'HTTP_SERVICE_FOREACH_RESOURCE'
- 'I3C_BUS_FOR_EACH_I3CDEV'
- 'I3C_BUS_FOR_EACH_I2CDEV'
- 'MIN_HEAP_FOREACH'
IfMacros:
- 'CHECKIF'
# Disabled for now, see bug https://github.com/zephyrproject-rtos/zephyr/issues/48520
#IncludeBlocks: Regroup
- '_WAIT_Q_FOR_EACH'
#IncludeBlocks: Preserve # Unknown to clang-format-5.0
IncludeCategories:
- Regex: '^".*\.h"$'
Priority: 0
- Regex: '^<(assert|complex|ctype|errno|fenv|float|inttypes|limits|locale|math|setjmp|signal|stdarg|stdbool|stddef|stdint|stdio|stdlib|string|tgmath|time|wchar|wctype)\.h>$'
Priority: 1
- Regex: '^\<zephyr/.*\.h\>$'
Priority: 2
- Regex: '.*'
Priority: 3
Priority: 1
IncludeIsMainRegex: '(Test)?$'
IndentCaseLabels: false
IndentGotoLabels: false
#IndentPPDirectives: None # Unknown to clang-format-5.0
IndentWidth: 8
InsertBraces: true
InsertNewlineAtEOF: true
SpaceBeforeInheritanceColon: False
SpaceBeforeParens: ControlStatementsExceptControlMacros
SortIncludes: Never
UseTab: ForContinuationAndIndentation
WhitespaceSensitiveMacros:
- COND_CODE_0
- COND_CODE_1
- IF_DISABLED
- IF_ENABLED
- LISTIFY
- STRINGIFY
- Z_STRINGIFY
- DT_FOREACH_PROP_ELEM_SEP
IndentWrappedFunctionNames: false
JavaScriptQuotes: Leave
JavaScriptWrapImports: true
KeepEmptyLinesAtTheStartOfBlocks: false
MacroBlockBegin: ''
MacroBlockEnd: ''
MaxEmptyLinesToKeep: 1
NamespaceIndentation: Inner
#ObjCBinPackProtocolList: Auto # Unknown to clang-format-5.0
ObjCBlockIndentWidth: 8
ObjCSpaceAfterProperty: true
ObjCSpaceBeforeProtocolList: true
# Taken from git's rules
#PenaltyBreakAssignment: 10 # Unknown to clang-format-4.0
PenaltyBreakBeforeFirstCallParameter: 30
PenaltyBreakComment: 10
PenaltyBreakFirstLessLess: 0
PenaltyBreakString: 10
PenaltyExcessCharacter: 100
PenaltyReturnTypeOnItsOwnLine: 60
PointerAlignment: Right
ReflowComments: false
SortIncludes: false
#SortUsingDeclarations: false # Unknown to clang-format-4.0
SpaceAfterCStyleCast: false
SpaceAfterTemplateKeyword: true
SpaceBeforeAssignmentOperators: true
#SpaceBeforeCtorInitializerColon: true # Unknown to clang-format-5.0
#SpaceBeforeInheritanceColon: true # Unknown to clang-format-5.0
SpaceBeforeParens: ControlStatements
#SpaceBeforeRangeBasedForLoopColon: true # Unknown to clang-format-5.0
SpaceInEmptyParentheses: false
SpacesBeforeTrailingComments: 1
SpacesInAngles: false
SpacesInContainerLiterals: false
SpacesInCStyleCastParentheses: false
SpacesInParentheses: false
SpacesInSquareBrackets: false
Standard: Cpp03
TabWidth: 8
UseTab: Always
...


@@ -1,21 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
#
# Copyright (c) 2024, Basalte bv
analyzer:
# Start by disabling all
- --disable-all
# Enable the sensitive profile
- --enable=sensitive
# Disable unused cases
- --disable=boost
- --disable=mpi
# Many identifiers in zephyr start with _
- --disable=clang-diagnostic-reserved-identifier
- --disable=clang-diagnostic-reserved-macro-identifier
# Cleanup
- --clean


@@ -12,10 +12,10 @@ coverage:
patch: yes
changes: no
# ignore:
# - "tests/**/*"
# - "samples/**/*"
# - "ext/hal/**/*"
#ignore:
# - "tests/**/*"
# - "samples/**/*"
# - "ext/hal/**/*"
parsers:
gcov:


@@ -9,7 +9,7 @@ charset = utf-8
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
max_line_length = 100
max_line_length = 80
# Assembly
[*.S]
@@ -21,16 +21,6 @@ indent_size = 8
indent_style = tab
indent_size = 8
# C++
[*.{cpp,hpp}]
indent_style = tab
indent_size = 8
# Linker Script
[*.ld]
indent_style = tab
indent_size = 8
# Python
[*.py]
indent_style = space
@@ -41,11 +31,6 @@ indent_size = 4
indent_style = tab
indent_size = 8
# reStructuredText
[*.rst]
indent_style = space
indent_size = 3
# YAML
[*.{yml,yaml}]
indent_style = space
@@ -75,7 +60,6 @@ indent_size = 2
# Makefile
[Makefile]
indent_style = tab
indent_size = 8
# Device tree
[*.{dts,dtsi,overlay}]
@@ -84,13 +68,8 @@ indent_size = 8
# Git commit messages
[COMMIT_EDITMSG]
max_line_length = 75
# Patches
[{*.patch,*.diff}]
trim_trailing_whitespace = false
max_line_length = 72
# Kconfig
[Kconfig*]
indent_style = tab
indent_size = 8
indent_style=tab

.gitattributes vendored

@@ -4,10 +4,6 @@
.gitignore export-ignore
.mailmap export-ignore
# Tell git to not diff certain files
*.svg -diff
# Tell linguist that generated test pattern files should not be included in the
# language statistics.
*.pat linguist-generated
*.svg linguist-generated
*.pat linguist-generated=true


@@ -1,85 +0,0 @@
name: Bug Report
description: File a bug report.
labels: ["bug"]
type: "Bug"
assignees: []
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this bug report!
- type: textarea
id: what-happened
attributes:
label: Describe the bug
description: |
A clear and concise description of what the bug is.
placeholder: |
Please also mention any information which could help others to understand
the problem you're facing:
- What target platform are you using?
- What have you tried to diagnose or workaround this issue?
- Is this a regression? If yes, have you been able to "git bisect" it to a
specific commit?
validations:
required: true
- type: checkboxes
id: regression
attributes:
label: Regression
description: |
Check this box if this is a regression and provide a SHA if you were able to "git bisect" to a specific commit.
options:
- label: This is a regression.
required: false
- type: textarea
id: reproduce
attributes:
label: Steps to reproduce
description: |
Steps to reproduce the behavior.
placeholder: |
Steps to reproduce the behavior:
1. mkdir build; cd build
2. cmake -DBOARD=board\_xyz
3. make
4. See error
validations:
required: false
- type: textarea
id: logs
attributes:
label: Relevant log output
description: Please copy and paste any relevant log output. This will be automatically formatted into code, so no need for backticks.
render: shell
- type: dropdown
attributes:
label: Impact
description: Impact of this bug
multiple: false
options:
- Showstopper: Prevents release or major functionality; system unusable.
- Major: Severely degrades functionality; workaround is difficult or unavailable.
- Functional Limitation: Some features not working as expected, but system usable.
- Annoyance: Minor irritation; no significant impact on usability or functionality.
- Intermittent: Occurs occasionally; hard to reproduce.
- Not sure
default: 3
validations:
required: true
- type: textarea
id: env
attributes:
label: Environment
description: please complete the following information
placeholder: |
- OS: (e.g. Linux, MacOS, Windows)
- Toolchain (e.g Zephyr SDK, ...)
- Commit SHA or Version used
- type: textarea
id: context
attributes:
label: Additional Context
description: Provide other context that could be relevant to the bug, such as pin setting, target configuration,etc.


@@ -1,42 +0,0 @@
name: Enhancement
description: Submit an Enhancement
labels: ["Enhancement"]
type: "Enhancement"
assignees: []
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this enhancement proposal.
- type: textarea
id: description
attributes:
label: Summary
description: |
Is your enhancement proposal related to a problem? Please describe.
placeholder: |
A clear and concise description of what the problem is.
validations:
required: true
- type: textarea
id: solution
attributes:
label: Describe the solution you'd like
description: |
Describe the solution you'd like
placeholder: |
A clear and concise description of what you want to happen.
validations:
required: true
- type: textarea
id: alternatives
attributes:
label: Alternatives
description: Describe alternatives you've considered
placeholder: |
A clear and concise description of any alternative solutions or features you've considered.
- type: textarea
id: context
attributes:
label: Additional Context
description: Add any other context or graphics (drag-and-drop an image) about the enhancement here.


@@ -1,75 +0,0 @@
name: RFC / Proposal
description: Submit a Proposal (RFC)
labels: ["RFC"]
type: RFC
assignees: []
body:
- type: markdown
attributes:
value: |
## Introduction
This section targets end users, TSC members, maintainers and anyone else
that might need a quick explanation of your proposed change.
- type: textarea
id: problem-description
attributes:
label: Problem Description
description: Why do we want this change and what problem are we trying to address?
placeholder: Explain the problem or limitation this RFC is meant to resolve.
validations:
required: true
- type: textarea
id: proposed-change-summary
attributes:
label: Proposed Change (Summary)
description: A high-level summary of the proposed change.
placeholder: Brief summary of what will change if this RFC is implemented.
validations:
required: true
- type: markdown
attributes:
value: |
## Detailed RFC
This section targets the development team. Upon reading it, each engineer
should understand what must be done to implement the proposed feature.
- type: textarea
id: detailed-change
attributes:
label: Proposed Change (Detailed)
description: Describe the change in as much detail as possible. Include context or background info, and reuse of existing components if applicable.
placeholder: Explain exactly what you're planning to change and how.
validations:
required: true
- type: textarea
id: dependencies
attributes:
label: Dependencies
description: Highlight how this change may affect the rest of the project or other teams/components.
placeholder: List components, modules, or teams affected.
validations:
required: false
- type: textarea
id: concerns
attributes:
label: Concerns and Unresolved Questions
description: List any concerns, unknowns, or unresolved questions related to this proposal.
placeholder: Any areas of uncertainty?
validations:
required: false
- type: textarea
id: alternatives
attributes:
label: Alternatives Considered
description: What alternative solutions were considered? Why was this proposal chosen?
placeholder: List alternatives and explain the rationale behind your choice.
validations:
required: false


@@ -1,29 +0,0 @@
name: Feature Request
description: Suggest a new feature or enhancement
labels: ["Feature Request"]
type: Feature
assignees: []
body:
- type: textarea
id: problem
attributes:
label: Is your feature request related to a problem? Please describe.
description: A clear and concise description of what the problem is.
placeholder: e.g., I'm frustrated when I need to do X manually because Y is missing.
validations:
required: true
- type: textarea
id: solution
attributes:
label: Describe the solution you'd like
description: A clear and concise description of what you want to happen.
placeholder: e.g., It would be great if the system could automatically handle X by doing Y.
validations:
required: true
- type: textarea
id: alternatives
attributes:
label: Describe alternatives you've considered
description: Include any alternative solutions or features


@@ -1,57 +0,0 @@
name: Contributor Nomination
description: Nominate a GitHub user for the Contributor role with triage permissions
labels: [Role Nomination]
assignees: ['nashif']
body:
- type: markdown
attributes:
value: |
## Background
The [TSC Project Roles](https://docs.zephyrproject.org/latest/project/project_roles.html) defines the main roles for the Zephyr Project, including Maintainer, Collaborator, and Contributor.
By default, anyone who contributes code or documentation is a Contributor, but with the lowest [GitHub Permission Level](https://docs.github.com/en/organizations/managing-access-to-your-organizations-repositories/repository-roles-for-an-organization) of **Read**.
Use this form to nominate a user for the **Contributor** role with **Triage** permission, which allows the user to:
- Add reviewers to pull requests
- Be added as a reviewer by others
- type: input
id: full-name
attributes:
label: Full Name
description: Full name of the nominated contributor.
placeholder: e.g., Jane Doe
validations:
required: true
- type: input
id: github-username
attributes:
label: GitHub Username
description: GitHub handle of the nominated contributor.
placeholder: e.g., @janedoe
validations:
required: true
- type: input
id: organization
attributes:
label: Organization
description: Organization the nominee is affiliated with (optional).
placeholder: e.g., Acme Corp
validations:
required: false
- type: textarea
id: supporting-documents
attributes:
label: Supporting Documents
description: Provide links to 3-5 pull requests authored or reviewed by the nominee that demonstrate their dedication to the Zephyr project.
placeholder: |
e.g.,
- https://github.com/zephyrproject-rtos/zephyr/pull/12345
- https://github.com/zephyrproject-rtos/zephyr/pull/23456
- https://github.com/zephyrproject-rtos/zephyr/pull/34567
validations:
required: true


@@ -1,106 +0,0 @@
name: External Component Integration
description: Propose integration of an external open source component
labels: ["TSC"]
assignees: []
body:
- type: textarea
id: origin
attributes:
label: Origin
description: Name of project hosting the original open source code. Provide a link to the source.
placeholder: e.g., SQLite - https://sqlite.org
validations:
required: true
- type: textarea
id: purpose
attributes:
label: Purpose
description: Brief description of what this software does.
placeholder: |
e.g., A small, fast, self-contained SQL database engine.
validations:
required: true
- type: textarea
id: integration-mode
attributes:
label: Mode of Integration
description: Should this be integrated in the main tree or as a module? Explain your choice and suggest a module repo name if applicable.
placeholder: |
e.g., As a module - proposed repo name: zephyr-sqlite
validations:
required: true
- type: textarea
id: maintainership
attributes:
label: Maintainership
description: List maintainers (GitHub IDs) for this integration. Include at least one primary maintainer.
placeholder: |
e.g., @username1 (primary), @username2 (collaborator)
validations:
required: true
- type: input
id: pull-request
attributes:
label: Pull Request
description: Link to the pull request (if any) for this integration. Must be labeled "DNM" (Do Not Merge).
placeholder: |
e.g., https://github.com/zephyrproject-rtos/zephyr/pull/12345
validations:
required: false
- type: textarea
id: description
attributes:
label: Description
description: Long-form description to justify suitability of this component.
placeholder: |
- What is its primary functionality?
- What problem does it solve?
- Why is this the right component?
validations:
required: true
- type: textarea
id: security
attributes:
label: Security
description: Security-related aspects of this component, including cryptographic functions and known vulnerabilities.
placeholder: |
- Does it use cryptography?
- How are vulnerabilities handled?
- Any known CVEs?
validations:
required: false
- type: textarea
id: dependencies
attributes:
label: Dependencies
description: What does this component depend on, and how will it be integrated (directly or via abstraction)?
placeholder: |
- Other external packages?
- Direct or abstracted use in Zephyr?
validations:
required: false
- type: input
id: revision
attributes:
label: Version or SHA
description: Which version or specific commit should be initially integrated?
placeholder: e.g., v3.45.0 or 79cc94d
validations:
required: true
- type: input
id: license
attributes:
label: License (SPDX)
description: Provide the license using a valid SPDX identifier (e.g., BSD-3-Clause).
placeholder: e.g., MIT or BSD-3-Clause
validations:
required: true
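For orientation only: when the module mode of integration is chosen, the component ultimately shows up as an entry in the `west.yml` manifest. The fragment below is a hypothetical sketch assuming the placeholder names used in this form (repo name, version); it is not part of this diff.

```yaml
# Hypothetical west.yml fragment for a module integration.
# Name, URL, revision, and path are placeholders echoing the form's examples.
manifest:
  projects:
    - name: zephyr-sqlite                                   # proposed module repo name
      url: https://github.com/zephyrproject-rtos/zephyr-sqlite
      revision: v3.45.0                                     # "Version or SHA" from above
      path: modules/lib/sqlite                              # checkout location in the workspace
```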

.github/ISSUE_TEMPLATE/bug_report.md

@@ -0,0 +1,40 @@
---
name: Bug report
about: Create a report to help us improve Zephyr
title: ''
labels: bug
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
What have you tried to diagnose or work around this issue?
**To Reproduce**
Steps to reproduce the behavior:
1. mkdir build; cd build
2. cmake -DBOARD=board\_xyz
3. make
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Impact**
What impact does this issue have on your progress (e.g., annoyance, showstopper)?
**Logs and console output**
If applicable, add console logs or other types of debug information,
e.g. a Wireshark capture or logic analyzer capture (upload in a zip archive).
Copy-and-paste text and put a code fence (\`\`\`) before and after to help
explain the issue. (If you are unable to obtain a text log, add a screenshot.)
**Environment (please complete the following information):**
- OS: (e.g. Linux, MacOS, Windows)
- Toolchain (e.g Zephyr SDK, ...)
- Commit SHA or Version used
**Additional context**
Add any other context about the problem here.


@@ -1,5 +0,0 @@
blank_issues_enabled: false
contact_links:
- name: Zephyr Community Support
url: https://github.com/zephyrproject-rtos/zephyr/discussions
about: Please ask and answer questions here.

.github/ISSUE_TEMPLATE/enhancement.md

@@ -0,0 +1,20 @@
---
name: Enhancement
about: Suggest enhancements to existing features
title: ''
labels: Enhancement
assignees: ''
---
**Is your enhancement proposal related to a problem? Please describe.**
A clear and concise description of what the problem is.
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or graphics (drag-and-drop an image) about the feature request here.


@@ -0,0 +1,20 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: feature request
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is.
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or graphics (drag-and-drop an image) about the feature request here.


@@ -0,0 +1,19 @@
---
name: Hardware Support
about: Suggest adding hardware support
title: ''
labels: Hardware Support
assignees: ''
---
**Is this request related to missing driver support for a particular hardware platform, SoC, or board? Please describe.**
Describe in detail the hardware support being requested and why this support benefits Zephyr.
**Why are you asking for this support?**
Describe why you are asking for this support instead of contributing it directly to the tree.
If this is a new board or SoC, please state whether you are willing to maintain the Zephyr support for it if it is included in the main tree.
**Additional context**
Add any other context or graphics (drag-and-drop an image) about the hardware here.

.github/ISSUE_TEMPLATE/rfc-proposal.md

@@ -0,0 +1,51 @@
---
name: RFC / Proposal
about: Submit an RFC / Proposal
title: ''
labels: RFC
assignees: ''
---
## Introduction
This section targets end users, TSC members, maintainers and anyone else that might
need a quick explanation of your proposed change.
### Problem description
Why do we want this change and what problem are we trying to address?
### Proposed change
A brief summary of the proposed change - the 10,000 ft view of what will
change once this proposal is implemented.
## Detailed RFC
In this section of the document the target audience is the dev team. Upon
reading this section each engineer should have a rather clear picture of what
needs to be done in order to implement the described feature.
### Proposed change (Detailed)
This section is freeform - describe your change in as much detail
as possible. Be sure to include any context or background info here,
for example whether existing components can be reused or altered.
By reading this section, each team member should be able to know exactly what
you're planning to change and how.
### Dependencies
Highlight how the change may affect the rest of the project (new components,
modifications in other areas), or other teams/projects.
### Concerns and Unresolved Questions
List any concerns, unknowns, and generally unresolved questions.
## Alternatives
List any alternatives considered, and the reasons for choosing this option
over them.

.github/SECURITY.md

@@ -1,22 +0,0 @@
# Security Policy
## Supported versions
The Zephyr project supports the following versions with security
updates:
- The most recent release, and the release prior to that.
- Active LTS releases.
At this time, with the latest release of v4.3, the supported
versions are:
- v4.3: Current release
- v4.2: Prior release
- v3.7: Current LTS
## Reporting process
Please see our [Security Vulnerability
Reporting](https://docs.zephyrproject.org/latest/security/reporting.html)
page for details on the process.


@@ -1,2 +0,0 @@
paths:
- .github


@@ -1,2 +0,0 @@
paths:
- doc


@@ -1,30 +0,0 @@
version: 2
enable-beta-ecosystems: true
updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
cooldown:
default-days: 7
commit-message:
prefix: "ci: github: "
labels: []
groups:
actions-deps:
patterns:
- "*"
- package-ecosystem: "uv"
directory: "/doc"
schedule:
interval: "weekly"
cooldown:
default-days: 7
commit-message:
prefix: "ci: doc: "
labels: []
groups:
doc-deps:
patterns:
- "*"

.github/labeler.yml

@@ -0,0 +1,161 @@
"Release Notes":
- "doc/releases/**/*"
"area: Modem":
- "drivers/modem/**/*"
"area: PWM":
- "drivers/pwm/**/*"
"area: Watchdog":
- "drivers/watchdog/**/*"
"area: Sensors":
- "drivers/sensors/**/*"
"area: ADC":
- "drivers/adc/**/*"
"area: Counter":
- "drivers/counter/**/*"
"area: CAN":
- "include/drivers/can.h"
- "include/canbus/*/**"
- "drivers/can/**/*"
- "subsys/canbus/*/**"
"area: EEPROM":
- "include/drivers/eeprom.h"
- "drivers/eeprom/**/*"
"area: Timer":
- "drivers/timer/**/*"
"area: I2S":
- "drivers/i2s/**/*"
"area: C Library":
- "lib/libc/**/*"
"area: Devicetree":
- "dts/**/*"
- "**/*.dts"
- "**/*.dtsi"
- "include/devicetree.h"
- "include/devicetree/*"
- "doc/guides/dts/**/*"
"area: Devicetree Binding":
- "include/dt-bindings/**/*"
- "dts/bindings/**/*"
"area: Devicetree Tooling":
- "scripts/dts/**/*"
"area: I2C":
- "drivers/i2c/**/*"
"area: SPI":
- "drivers/spi/**/*"
"area: Boards":
- "boards/**/*"
"area: POSIX":
- "lib/posix/**/*"
"area: native port":
- "arch/posix/**/*"
- "include/arch/posix/**/*"
- "soc/posix/**/*"
- "**/*native_posix*"
"area: X86":
- "arch/x86/**/*"
- "include/arch/x86/**/*"
"area: ARM":
- "arch/arm/**/*"
- "include/arch/arm/**/*"
"area: ARM_64":
- "arch/arm/core/aarch64/**/*"
- "include/arch/arm/aarch64/**/*"
"area: NIOS2":
- "arch/nios2/**/*"
- "include/arch/nios2/**/*"
"area: Xtensa":
- "arch/xtensa/**/*"
- "include/arch/xtensa/**/*"
"area: RISCv32/64":
- "arch/risv/**/*"
- "include/arch/riscv/**/*"
"area: ARC":
- "arch/arc/**/*"
- "include/arch/arc/**/*"
"area: Networking":
- "subsys/net/**/*"
- "samples/net/**/*"
- "tests/net/**/*"
- "include/net/**/*"
- "include/drivers/ieee802154/**/*"
- "drivers/ethernet/**/*"
- "drivers/ieee802154/**/*"
- "drivers/wifi/**/*"
- "drivers/ptp_clock/**/*"
- "drivers/net/**/*"
"area: Logging":
- "subsys/logging/**/*"
"area: Shell":
- "subsys/shell/**/*"
"area: Console":
- "subsys/console/**/*"
"area: Test Framework":
- "subsys/testsuite/**/*"
"area: Settings":
- "subsys/settings/**/*"
"area: File System":
- "subsys/fs/**/*"
"area: Storage":
- "subsys/storage/**/*"
"area: Bluetooth":
- "subsys/bluetooth/**/*"
- "**/*bluetooth*"
"area: Bluetooth Mesh":
- "subsys/bluetooth/mesh/**/*"
"area: Bluetooth Controller":
- "subsys/bluetooth/controller/**/*"
"area: Bluetooth Host":
- "subsys/bluetooth/host/**/*"
- "subsys/bluetooth/services/**/*"
"area: API":
- "include/**/*"
"area: Samples":
- "samples/**/*"
"area: Tests":
- "tests/**/*"
"area: Kernel":
- "kernel/**/*"
- "tests/kernel/**/*"
"area: Documentation":
- "**/*.rst"
- "**/*.md"
"area: Build System":
- "cmake/**/*"
- "CmakeLists.txt"
"area: Kconfig":
- "scripts/kconfig/**/*"
- "Kconfig"
- "Kconfig.zephyr"
"area: Twister":
- "scripts/twister"
- "scripts/pylib/twister/**/*"
"area: Modules":
- "west.yml"
- "modules/**/*"
"area: Shields":
- "boards/shields/**"
- "samples/shields/**"
"platform: NXP":
- "boards/arm/frdm*/**"
- "boards/arm/hexiwear*/**"
- "boards/arm/lpcxpresso*/**"
- "boards/arm/*imx*/**"
- "drivers/**/*imx*"
- "drivers/**/*mcux*"
- "dts/arm/nxp/*/*"
- "dts/bindings/**/nxp*"
- "soc/arm/nxp*/**"
"platform: STM32":
- "boards/arm/nucleo_*/**"
- "boards/arm/*stm32*/**"
- "drivers/**/*stm32*"
- "dts/arm/st/*/*"
- "include/drivers/*/*stm32*"
- "soc/arm/st_stm32/**"
"platform: SiLabs":
- "boards/arm/efr32_*/**/*"
- "boards/arm/efm32_*/**/*"
- "drivers/**/*gecko*"
- "dts/arm/silabs/**/*"
- "dts/bindings/**/silabs,gecko*"
- "soc/arm/silabs_exx32/**/*"


@@ -1,88 +0,0 @@
name: Pull Request/Issue Assigner
on:
pull_request_target:
types:
- opened
- synchronize
- reopened
- ready_for_review
branches:
- main
- collab-*
- v*-branch
issues:
types:
- labeled
permissions:
contents: read
jobs:
assignment:
name: Pull Request/Issue Assignment
if: github.event.pull_request.draft == false
runs-on: ubuntu-24.04
permissions:
issues: write # to add assignees to issues
steps:
- name: Check out source code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
persist-credentials: false
- name: Set up Python
uses: zephyrproject-rtos/action-python-env@32e53bef090c33d53aa94f1d9a9d29c93cfdc5f7 # main
with:
python-version: 3.12
- name: Fetch west.yml/Maintainer.yml from pull request
if: >
github.event_name == 'pull_request_target' && github.base_ref == 'main'
run: |
git fetch origin pull/${{ github.event.pull_request.number }}/merge
git show FETCH_HEAD:west.yml > pr_west.yml
git show FETCH_HEAD:MAINTAINERS.yml > pr_MAINTAINERS.yml
- name: west setup
if: >
github.event_name == 'pull_request_target'
run: |
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
west init -l . || true
- name: Run assignment script
env:
GITHUB_TOKEN: ${{ secrets.ZB_PR_ASSIGNER_GITHUB_TOKEN }}
run: |
FLAGS="-v"
FLAGS+=" -o ${{ github.event.repository.owner.login }}"
FLAGS+=" -r ${{ github.event.repository.name }}"
FLAGS+=" -M MAINTAINERS.yml"
if [ "${{ github.event_name }}" = "pull_request_target" ]; then
if [ "${{ github.base_ref }}" = "main" ]; then
FLAGS+=" -P ${{ github.event.pull_request.number }} --updated-manifest pr_west.yml --updated-maintainer-file pr_MAINTAINERS.yml"
else
FLAGS+=" -P ${{ github.event.pull_request.number }}"
fi
elif [ "${{ github.event_name }}" = "issues" ]; then
FLAGS+=" -I ${{ github.event.issue.number }}"
elif [ "${{ github.event_name }}" = "schedule" ]; then
FLAGS+=" --modules"
else
echo "Unknown event: ${{ github.event_name }}"
exit 1
fi
python3 scripts/ci/set_assignees.py $FLAGS
- name: Check maintainer file changes
if: >
github.event_name == 'pull_request_target' && github.base_ref == 'main'
env:
GITHUB_TOKEN: ${{ secrets.ZB_PR_ASSIGNER_GITHUB_TOKEN }}
run: |
python ./scripts/ci/check_maintainer_changes.py \
--repo zephyrproject-rtos/zephyr MAINTAINERS.yml pr_MAINTAINERS.yml


@@ -4,35 +4,13 @@ on:
types:
- closed
- labeled
branches:
- main
permissions:
contents: read
jobs:
backport:
runs-on: ubuntu-18.04
name: Backport
runs-on: ubuntu-24.04
permissions:
contents: write # to create/push backport branches
pull-requests: write # to create backport PRs
issues: write # to add labels to issue created if backport fails
# Only react to merged PRs for security reasons.
# See https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#pull_request_target.
if: >
github.event.pull_request.merged &&
(
github.event.action == 'closed' ||
(
github.event.action == 'labeled' &&
contains(github.event.label.name, 'backport')
)
)
steps:
- name: Backport
uses: zephyrproject-rtos/action-backport@7e74f601d11eaca577742445e87775b5651a965f # v2.0.3-3
uses: zephyrproject-rtos/action-backport@v1.1.99
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
issue_labels: Backport
labels_template: '["Backport"]'


@@ -1,50 +0,0 @@
name: Backport Issue Check
on:
pull_request_target:
types:
- edited
- opened
- reopened
- synchronize
branches:
- v*-branch
permissions:
contents: read
jobs:
backport:
name: Backport Issue Check
concurrency:
group: backport-issue-check-${{ github.ref }}
cancel-in-progress: true
runs-on: ubuntu-24.04
if: github.repository == 'zephyrproject-rtos/zephyr'
permissions:
issues: read # to check if associated issue exists for backport
steps:
- name: Check out source code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run backport issue checker
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
./scripts/release/list_backports.py \
-o ${{ github.event.repository.owner.login }} \
-r ${{ github.event.repository.name }} \
-b ${{ github.event.pull_request.base.ref }} \
-p ${{ github.event.pull_request.number }}


@@ -1,34 +0,0 @@
name: Publish BabbleSim Tests Results
on:
workflow_run:
workflows: ["BabbleSim Tests"]
types:
- completed
permissions:
contents: read
jobs:
bsim-test-results:
name: "Publish BabbleSim Test Results"
runs-on: ubuntu-24.04
if: github.event.workflow_run.conclusion != 'skipped'
permissions:
checks: write # to create the check run entry with test results
steps:
- name: Download artifacts
uses: dawidd6/action-download-artifact@0bd50d53a6d7fb5cb921e607957e9cc12b4ce392 # v12
with:
run_id: ${{ github.event.workflow_run.id }}
- name: Publish BabbleSim Test Results
uses: EnricoMi/publish-unit-test-result-action@27d65e188ec43221b20d26de30f4892fad91df2f # v2.22.0
with:
check_name: BabbleSim Test Results
comment_mode: off
commit: ${{ github.event.workflow_run.head_sha }}
event_file: event/event.json
event_name: ${{ github.event.workflow_run.event }}
files: "bsim-test-results/**/bsim_results.xml"


@@ -1,212 +0,0 @@
name: BabbleSim Tests
on:
pull_request:
paths:
- ".github/workflows/bsim-tests.yaml"
- ".github/workflows/bsim-tests-publish.yaml"
- "west.yml"
- "subsys/bluetooth/**"
- "tests/bsim/**"
- "boards/nordic/nrf5*/*dt*"
- "dts/*/nordic/**"
- "tests/bluetooth/**"
- "samples/bluetooth/**"
- "boards/native/**"
- "soc/native/**"
- "arch/posix/**"
- "include/zephyr/arch/posix/**"
- "scripts/native_simulator/**"
- "samples/net/sockets/echo_*/**"
- "modules/hal_nordic/**"
- "modules/mbedtls/**"
- "modules/openthread/**"
- "subsys/net/l2/openthread/**"
- "include/zephyr/net/openthread.h"
- "drivers/ieee802154/**"
- "include/zephyr/net/ieee802154*"
- "drivers/serial/*nrfx*"
- "tests/drivers/uart/**"
- '!**.rst'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
bsim-test:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
env:
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
EDTT_PATH: ../tools/edtt
permissions:
checks: write # to create the check run entry with test results
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
- name: Environment Setup
env:
BASE_REF: ${{ github.base_ref }}
run: |
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
west init -l . || true
west config manifest.group-filter -- +ci
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Check common triggering files
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
id: check-common-files
with:
files: |
.github/workflows/bsim-tests.yaml
.github/workflows/bsim-tests-publish.yaml
west.yml
boards/native/
soc/native/
arch/posix/
include/zephyr/arch/posix/
scripts/native_simulator/
tests/bsim/*
boards/nordic/nrf5*/*dt*
dts/*/nordic/
modules/mbedtls/**
modules/hal_nordic/**
- name: Check if Bluetooth files changed
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
id: check-bluetooth-files
with:
files: |
samples/bluetooth/
subsys/bluetooth/
tests/bluetooth/
tests/bsim/bluetooth/
- name: Check if Networking files changed
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
id: check-networking-files
with:
files: |
tests/bsim/net/
samples/net/sockets/echo_*/
modules/openthread/
subsys/net/l2/openthread/
include/zephyr/net/openthread.h
drivers/ieee802154/
include/zephyr/net/ieee802154*
- name: Check if UART files changed
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
id: check-uart-files
with:
files: |
tests/bsim/drivers/uart/
drivers/serial/*nrfx*
tests/drivers/uart/
- name: Update BabbleSim to manifest revision
if: >
steps.check-bluetooth-files.outputs.any_modified == 'true'
|| steps.check-networking-files.outputs.any_modified == 'true'
|| steps.check-uart-files.outputs.any_modified == 'true'
|| steps.check-common-files.outputs.any_modified == 'true'
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git -c advice.detachedHead=false checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- name: Run Bluetooth Tests with BSIM
if: steps.check-bluetooth-files.outputs.any_modified == 'true' || steps.check-common-files.outputs.any_modified == 'true'
run: |
tests/bsim/ci.bt.sh
- name: Run Networking Tests with BSIM
if: steps.check-networking-files.outputs.any_modified == 'true' || steps.check-common-files.outputs.any_modified == 'true'
run: |
tests/bsim/ci.net.sh
- name: Run UART Tests with BSIM
if: steps.check-uart-files.outputs.any_modified == 'true' || steps.check-common-files.outputs.any_modified == 'true'
run: |
tests/bsim/ci.uart.sh
- name: Merge Test Results
run: |
junitparser merge --glob "./bsim_*/*bsim_results.*.xml" "./twister-out/twister.xml" junit.xml
junit2html junit.xml junit.html
- name: Upload Unit Test Results in HTML
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: HTML Unit Test Results
if-no-files-found: ignore
path: |
junit.html
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@27d65e188ec43221b20d26de30f4892fad91df2f # v2.22.0
with:
check_name: Bsim Test Results
files: "junit.xml"
comment_mode: off
- name: Upload Event Details
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: event
path: |
${{ github.event_path }}


@@ -1,76 +0,0 @@
# Copyright (c) 2021, 2022 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
# Make a snapshot of open bugs as a python pickle file, compressed
# using xz. Upload the xz file to Amazon S3.
name: Bug Snapshot
on:
workflow_dispatch:
schedule:
# Run daily at 00:05
- cron: '5 00 * * *'
permissions:
contents: read
jobs:
make_bugs_pickle:
name: Make bugs pickle
runs-on: ubuntu-24.04
if: github.repository_owner == 'zephyrproject-rtos' && github.ref == 'refs/heads/main'
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Snapshot bugs
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
BUGS_PICKLE_FILENAME="zephyr-bugs-$(date -I).pickle.xz"
BUGS_PICKLE_PATH="${RUNNER_TEMP}/${BUGS_PICKLE_FILENAME}"
set -euo pipefail
python3 scripts/make_bugs_pickle.py | xz > ${BUGS_PICKLE_PATH}
echo "BUGS_PICKLE_FILENAME=${BUGS_PICKLE_FILENAME}" >> ${GITHUB_ENV}
echo "BUGS_PICKLE_PATH=${BUGS_PICKLE_PATH}" >> ${GITHUB_ENV}
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708 # v5.1.1
with:
aws-access-key-id: ${{ vars.AWS_BUILDS_ZEPHYR_BUG_SNAPSHOT_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_BUILDS_ZEPHYR_BUG_SNAPSHOT_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
run: |
REPOSITORY_NAME="${GITHUB_REPOSITORY#*/}"
PUBLISH_BUCKET="builds.zephyrproject.org"
PUBLISH_DOMAIN="builds.zephyrproject.io"
if [ "${{github.event_name}}" = "schedule" ]; then
PUBLISH_ROOT="${REPOSITORY_NAME}/bug-snapshot/daily"
else
PUBLISH_ROOT="${REPOSITORY_NAME}/bug-snapshot"
fi
PUBLISH_UPLOAD_URI="s3://${PUBLISH_BUCKET}/${PUBLISH_ROOT}/${BUGS_PICKLE_FILENAME}"
PUBLISH_DOWNLOAD_URI="https://${PUBLISH_DOMAIN}/${PUBLISH_ROOT}/${BUGS_PICKLE_FILENAME}"
aws s3 cp --quiet ${BUGS_PICKLE_PATH} ${PUBLISH_UPLOAD_URI}
echo "Bug pickle is available at: ${PUBLISH_DOWNLOAD_URI}" >> ${GITHUB_STEP_SUMMARY}


@@ -1,184 +0,0 @@
name: Build with Clang/LLVM
on:
push:
branches:
- main
- v*-branch
- collab-*
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
clang-build:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
subset: [1, 2]
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
CCACHE_IGNOREOPTIONS: '-specs=* --specs=*'
LLVM_TOOLCHAIN_PATH: /usr/lib/llvm-20
BASE_REF: ${{ github.base_ref }}
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
persist-credentials: false
- name: Environment Setup
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git clean -f -d
git log --pretty=oneline | head -n 10
west init -l . || true
west config --global update.narrow true
west config manifest.group-filter -- +ci,+optional
# In some cases modules are left in a state where they can't be
# updated (i.e. when we cancel a job and the builder is killed),
# So first retry the update; if that does not work, remove all modules
# and start over. (Workaround until we implement more robust west module
# caching.)
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west2.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Check Environment
run: |
cmake --version
${LLVM_TOOLCHAIN_PATH}/bin/clang --version
gcc --version
ls -la
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- name: Update BabbleSim to manifest revision
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git -c advice.detachedHead=false checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- name: Run Tests with Twister
id: twister
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=llvm
./scripts/twister -p native_sim --force-color --inline-logs -M -N -v --retry-failed 2 \
-T tests --subset ${{matrix.subset}}/2 -j 16
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: Unit Test Results (Subset ${{ matrix.subset }})
path: |
twister-out/twister.xml
twister-out/twister.json
if-no-files-found: ignore
clang-build-results:
name: "Publish Unit Tests Results"
needs: clang-build
runs-on: ubuntu-24.04
permissions:
checks: write # to create GitHub annotations
if: (success() || failure())
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
persist-credentials: false
- name: Download Artifacts
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
path: artifacts
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Merge Test Results
run: |
junitparser merge --glob 'artifacts/*/twister.xml' 'artifacts/*/*/twister.xml' junit.xml
junit2html junit.xml junit-clang.html
- name: Upload Unit Test Results in HTML
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: HTML Unit Test Results
if-no-files-found: ignore
path: |
junit-clang.html
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@27d65e188ec43221b20d26de30f4892fad91df2f # v2.22.0
if: always()
with:
check_name: Unit Test Results
files: "**/twister.xml"
comment_mode: off


@@ -1,275 +0,0 @@
name: Code Coverage with codecov
on:
push:
branches:
- main
- v*-branch
- collab-*
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
codecov:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
platform: ["mps2/an385", "native_sim", "qemu_x86", "unit_testing"]
include:
- platform: 'mps2/an385'
normalized: 'mps2_an385'
- platform: 'native_sim'
normalized: 'native_sim'
- platform: 'qemu_x86'
normalized: 'qemu_x86'
- platform: 'unit_testing'
normalized: 'unit_testing'
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
# `--specs` is ignored because ccache is unable to resolve the toolchain specs file path.
CCACHE_IGNOREOPTIONS: '-specs=* --specs=*'
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: west setup
run: |
west init -l . || true
west update 1> west.update.log || west update 1> west.update-2.log
- name: Environment Setup
run: |
cmake --version
gcc --version
ls -la
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- name: Run Tests with Twister (Push)
continue-on-error: true
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
mkdir -p coverage/reports
./scripts/twister --save-tests ${{matrix.normalized}}-testplan.json
ls -la
./scripts/twister \
-i --force-color -N -v --filter runnable -p ${{ matrix.platform }} --coverage \
-T tests --coverage-tool gcovr -xCONFIG_TEST_EXTRA_STACK_SIZE=4096 -e nano \
--timeout-multiplier 2
- name: Build Doxygen Coverage
if: matrix.platform == 'unit_testing'
run: |
pip install -r doc/requirements.txt --require-hashes
sudo apt-get update
sudo apt-get install -y graphviz # dot is needed but currently missing from the Docker image
cmake -B doc/_build -S doc
cmake --build doc/_build --target doxygen-coverage
- name: Upload Doxygen Coverage Results
if: matrix.platform == 'unit_testing'
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: doxygen-coverage-results
path: |
doc/_build/new.info
doc/_build/coverage-report
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Rename coverage files
if: always()
run: |
mv twister-out/coverage.json coverage/reports/${{matrix.normalized}}.json
- name: Upload Coverage Results
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: Coverage Data (Subset ${{ matrix.normalized }})
path: |
coverage/reports/${{ matrix.normalized }}.json
${{ matrix.normalized }}-testplan.json
codecov-results:
name: "Publish Coverage Results"
needs: codecov
runs-on: ubuntu-24.04
# the codecov job might be skipped, we don't need to run this job then
if: success() || failure()
steps:
- name: checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Download Artifacts
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
path: coverage/reports
- name: Move coverage files
run: |
ls -lRt ./coverage/reports
mv ./coverage/reports/*/*testplan.json .
mv ./coverage/reports/*/coverage/reports/*.json ./coverage/reports
ls -la ./coverage/reports
- name: Generate list of coverage files
id: get-coverage-files
shell: cmake -P {0}
run: |
file(GLOB INPUT_FILES_LIST "coverage/reports/*.json")
set(MERGELIST "")
set(FILELIST "")
foreach(ITEM ${INPUT_FILES_LIST})
get_filename_component(f ${ITEM} NAME)
if(FILELIST STREQUAL "")
set(FILELIST "${f}")
else()
set(FILELIST "${FILELIST},${f}")
endif()
endforeach()
foreach(ITEM ${INPUT_FILES_LIST})
get_filename_component(f ${ITEM} NAME)
if(MERGELIST STREQUAL "")
set(MERGELIST "--add-tracefile ${f}")
else()
set(MERGELIST "${MERGELIST} -a ${f}")
endif()
endforeach()
file(APPEND $ENV{GITHUB_OUTPUT} "mergefiles=${MERGELIST}\n")
file(APPEND $ENV{GITHUB_OUTPUT} "covfiles=${FILELIST}\n")
- name: Merge coverage files
run: |
pushd ./coverage/reports
gcovr ${{ steps.get-coverage-files.outputs.mergefiles }} --merge-mode-functions=separate --json merged.json
gcovr ${{ steps.get-coverage-files.outputs.mergefiles }} --merge-mode-functions=separate --cobertura merged.xml
popd
- name: Get current date
id: run_date
run: |
echo "run_date=$(date --iso-8601=minutes)" >> "$GITHUB_OUTPUT"
echo "run_date_short=$(date +'%Y-%m-%d')" >> "$GITHUB_OUTPUT"
echo "run_date_year=$(date +'%Y')" >> "$GITHUB_OUTPUT"
echo "run_date_month=$(date +'%m')" >> "$GITHUB_OUTPUT"
- name: Generate Coverage Report
if: always()
run: |
python3 ./scripts/ci/coverage/coverage_analysis.py \
-t native_sim-testplan.json \
-m MAINTAINERS.yml \
-c coverage/reports/merged.json \
-o coverage-report-${{ steps.run_date.outputs.run_date_short }} \
-f all
cp coverage-report-* coverage/reports/
- name: Upload Merged Coverage Results and Report
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: Coverage Data and report
path: |
coverage/reports/merged.json
coverage/reports/merged.xml
coverage/reports/coverage-report-${{ steps.run_date.outputs.run_date_short }}.json
coverage/reports/coverage-report-${{ steps.run_date.outputs.run_date_short }}.xlsx
- name: Upload test coverage to Codecov
if: always()
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
with:
env_vars: OS,PYTHON
fail_ci_if_error: false
verbose: true
token: ${{ secrets.CODECOV_TOKEN }}
files: coverage/reports/merged.xml
flags: unittests-coverage
- name: Upload Doxygen coverage to Codecov
if: always()
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
with:
env_vars: OS,PYTHON
fail_ci_if_error: false
verbose: true
token: ${{ secrets.CODECOV_TOKEN }}
files: coverage/reports/doxygen-coverage-results/new.info
disable_search: true
flags: doxygen-coverage


@@ -1,58 +0,0 @@
name: "CodeQL"
on:
push:
branches:
- main
- v*-branch
- collab-*
schedule:
- cron: '34 16 * * 6'
pull_request:
branches:
- main
- v*-branch
- collab-*
permissions:
contents: read
jobs:
analyze:
name: Analyze (${{ matrix.language }})
runs-on: ubuntu-24.04
permissions:
security-events: write
strategy:
fail-fast: false
matrix:
include:
- language: python
build-mode: none
- language: actions
build-mode: none
config: ./.github/codeql/codeql-actions-config.yml
- language: javascript-typescript
build-mode: none
config: ./.github/codeql/codeql-js-config.yml
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Initialize CodeQL
uses: github/codeql-action/init@cdefb33c0f6224e58673d9004f47f7cb3e328b89 # v4.31.10
with:
languages: ${{ matrix.language }}
build-mode: ${{ matrix.build-mode }}
queries: security-extended
config-file: ${{ matrix.config }}
- if: matrix.build-mode == 'manual'
shell: bash
run: |
echo "nothing yet"
exit 0
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@cdefb33c0f6224e58673d9004f47f7cb3e328b89 # v4.31.10
with:
category: "/language:${{matrix.language}}"


@@ -1,64 +0,0 @@
name: Coding Guidelines
on: pull_request
permissions:
contents: read
jobs:
compliance_job:
runs-on: ubuntu-24.04
name: Run coding guidelines checks on patch series (PR)
steps:
- name: Checkout the code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Install Packages
run: |
sudo apt-get update
sudo apt-get install coccinelle
- name: Run Coding Guidelines Checks
continue-on-error: true
id: coding_guidelines
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=$PWD
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
git remote -v
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
source zephyr-env.sh
# debug
ls -la
git log --pretty=oneline | head -n 10
./scripts/ci/guideline_check.py --output output.txt -c origin/${BASE_REF}..
- name: check-warns
run: |
if [[ -s "output.txt" ]]; then
errors=$(cat output.txt)
errors="${errors//'%'/'%25'}"
errors="${errors//$'\n'/'%0A'}"
errors="${errors//$'\r'/'%0D'}"
echo "::error file=output.txt::$errors"
exit 1;
fi


@@ -1,19 +1,10 @@
name: Compliance Checks
name: Compliance
on:
pull_request:
types:
- edited
- opened
- reopened
- synchronize
permissions:
contents: read
on: pull_request
jobs:
check_compliance:
runs-on: ubuntu-24.04
compliance_job:
runs-on: ubuntu-latest
name: Run compliance checks on patch series (PR)
steps:
- name: Update PATH for west
@@ -21,55 +12,36 @@ jobs:
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Checkout the code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@v2
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Rebase onto the target branch
- name: cache-pip
uses: actions/cache@v1
with:
path: ~/.cache/pip
key: ${{ runner.os }}-doc-pip
- name: Install python dependencies
run: |
pip3 install setuptools
pip3 install wheel
pip3 install python-magic junitparser==1.6.3 gitlint pylint pykwalify
pip3 install west
- name: west setup
env:
BASE_REF: ${{ github.base_ref }}
run: |
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
git remote -v
# Ensure there are no merge commits in the PR
[[ "$(git rev-list --merges --count origin/${BASE_REF}..)" == "0" ]] || \
(echo "::error ::Merge commits not allowed, rebase instead";false)
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
# debug
git log --pretty=oneline | head -n 10
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: west setup
run: |
west init -l . || true
west config manifest.group-filter -- +ci,-optional
west update -o=--depth=1 -n 2>&1 1> west.update.log || west update -o=--depth=1 -n 2>&1 1> west.update2.log
- name: Setup Node.js
uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v6.2.0
with:
node-version: "lts/*"
cache: npm
check-latest: true
cache-dependency-path: ./scripts/ci/package-lock.json
- name: Install Node dependencies
run: npm --prefix ./scripts/ci ci
west update
- name: Run Compliance Tests
continue-on-error: true
@@ -81,58 +53,32 @@ jobs:
# debug
ls -la
git log --pretty=oneline | head -n 10
# Increase rename limit to allow for large PRs
git config diff.renameLimit 10000
excludes="-e KconfigBasic -e SysbuildKconfigBasic -e ClangFormat"
# The signed-off-by check for dependabot should be skipped
if [ "${{ github.actor }}" == "dependabot[bot]" ]; then
excludes="$excludes -e Identity"
fi
./scripts/ci/check_compliance.py --annotate $excludes -c origin/${BASE_REF}..
./scripts/ci/check_compliance.py -m Codeowners -m Devicetree -m Gitlint -m Identity -m Nits -m pylint -m checkpatch -m Kconfig -c origin/${BASE_REF}..
- name: upload-results
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
continue-on-error: true
uses: actions/upload-artifact@master
continue-on-error: True
with:
name: compliance.xml
path: compliance.xml
- name: Upload dts linter patch
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
continue-on-error: true
if: hashFiles('dts_linter.patch') != ''
with:
name: dts_linter.patch
path: dts_linter.patch
- name: check-warns
run: |
if [[ ! -s "compliance.xml" ]]; then
exit 1;
fi
warns=("ClangFormat" "LicenseAndCopyrightCheck")
files=($(./scripts/ci/check_compliance.py -l))
for file in "${files[@]}"; do
f="${file}.txt"
if [[ -s $f ]]; then
results=$(cat $f)
results="${results//'%'/'%25'}"
results="${results//$'\n'/'%0A'}"
results="${results//$'\r'/'%0D'}"
if [[ "${warns[@]}" =~ "${file}" ]]; then
echo "::warning file=${f}::$results"
else
echo "::error file=${f}::$results"
exit=1
fi
for file in Nits.txt checkpatch.txt Identity.txt Gitlint.txt pylint.txt Devicetree.txt Kconfig.txt Codeowners.txt; do
if [[ -s $file ]]; then
errors=$(cat $file)
errors="${errors//'%'/'%25'}"
errors="${errors//$'\n'/'%0A'}"
errors="${errors//$'\r'/'%0D'}"
echo "::error file=${file}::$errors"
exit=1
fi
done
if [ "${exit}" == "1" ]; then
echo "Compliance error, check for error messages in the \"Run Compliance Tests\" step"
echo "You can run this step locally with the ./scripts/ci/check_compliance.py script."
if [ ${exit} == 1 ]; then
exit 1;
fi

.github/workflows/conflict.yml

@@ -0,0 +1,14 @@
name: Conflict Finder
on:
push:
branches-ignore:
- '**'
jobs:
conflict:
runs-on: ubuntu-latest
steps:
- uses: mschilde/auto-label-merge-conflicts@master
with:
CONFLICT_LABEL_NAME: "has conflicts"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -5,43 +5,33 @@ name: Publish commit for daily testing
on:
schedule:
- cron: '50 22 * * *'
- cron: '50 22 * * *'
push:
branches:
- refs/tags/*
permissions:
contents: read
- refs/tags/*
jobs:
get_version:
runs-on: ubuntu-24.04
runs-on: ubuntu-latest
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708 # v5.1.1
uses: aws-actions/configure-aws-credentials@v1
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID_TESTING }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY_TESTING }}
aws-region: us-east-1
- name: install-pip
run: |
pip3 install gitpython
- name: checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Upload to AWS S3
run: |
python3 scripts/ci/version_mgr.py --update .


@@ -6,47 +6,59 @@ name: Devicetree script tests
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/dts/**'
- '.github/workflows/devicetree_checks.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/dts/**'
- '.github/workflows/devicetree_checks.yml'
permissions:
contents: read
jobs:
devicetree-checks:
name: Devicetree script tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-22.04, macos-14, windows-2022]
python-version: [3.6, 3.7, 3.8]
os: [ubuntu-latest, macos-latest, windows-latest]
steps:
- name: checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v1
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v1
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: cache-pip-mac
if: startsWith(runner.os, 'macOS')
uses: actions/cache@v1
with:
path: ~/Library/Caches/pip
# Trailing '-' was just to get a different cache name
key: ${{ runner.os }}-pip-${{ matrix.python-version }}-
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}-
- name: cache-pip-win
if: startsWith(runner.os, 'Windows')
uses: actions/cache@v1
with:
path: ~\AppData\Local\pip\Cache
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install python dependencies
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: run tox
working-directory: scripts/dts/python-devicetree
pip3 install wheel
pip3 install pytest pyyaml
- name: run pytest
working-directory: scripts/dts
run: |
tox
python -m pytest testdtlib.py testedtlib.py


@@ -1,279 +1,79 @@
# Copyright (c) 2020 Linaro Limited.
# SPDX-License-Identifier: Apache-2.0
name: Documentation Build
name: Documentation GitHub Workflow
on:
push:
branches:
- main
tags:
- v*
pull_request:
permissions:
contents: read
env:
DOXYGEN_VERSION: 1.15.0
DOXYGEN_SHA256SUM: 0ec2e5b2c3cd82b7106d19cb42d8466450730b8cb7a9e85af712be38bf4523a1
JOB_COUNT: 8
paths:
- 'Makefile'
- 'doc/**'
- '**.rst'
- 'include/**'
- 'kernel/include/kernel_arch_interface.h'
- 'lib/libc/**'
- 'subsys/testsuite/ztest/include/**'
- 'tests/**'
- '.known-issues/doc/**'
- '**/Kconfig*'
- 'west.yml'
- '.github/workflows/doc-build.yml'
jobs:
doc-file-check:
name: Check for doc changes
runs-on: ubuntu-24.04
outputs:
file_check: ${{ steps.check-doc-files.outputs.any_modified }}
steps:
- name: checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Check if Documentation related files changed
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
id: check-doc-files
with:
files: |
doc/
boards/**/doc/
**.rst
include/
kernel/include/kernel_arch_interface.h
lib/libc/**
subsys/testsuite/ztest/include/**
**/Kconfig*
west.yml
scripts/dts/
doc/requirements.txt
.github/workflows/doc-build.yml
scripts/pylib/pytest-twister-harness/src/twister_harness/device/device_adapter.py
scripts/pylib/pytest-twister-harness/src/twister_harness/helpers/shell.py
doc-build-html:
name: "Documentation Build (HTML)"
needs: [doc-file-check]
if: >
needs.doc-file-check.outputs.file_check == 'true' || github.event_name != 'pull_request'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
timeout-minutes: 60
concurrency:
group: doc-build-html-${{ github.ref }}
cancel-in-progress: true
env:
BASE_REF: ${{ github.base_ref }}
doc-build:
name: "Documentation Build"
runs-on: ubuntu-latest
steps:
- name: Print cloud service information
- name: Update PATH for west
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Environment Setup
run: |
if [ "${{github.event_name}}" = "pull_request" ]; then
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Builder"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
fi
echo "$HOME/.local/bin" >> $GITHUB_PATH
echo "$HOME/.cargo/bin" >> $GITHUB_PATH
west init -l . || true
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Install Python packages required for documentation build
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
pip install -r doc/requirements.txt --require-hashes
- name: Build HTML documentation
shell: bash
run: |
if [[ "$GITHUB_REF" =~ "refs/tags/v" ]]; then
DOC_TAG="release"
else
DOC_TAG="development"
fi
if [[ "${{ github.event_name }}" == "pull_request" ]]; then
DOC_TARGET="html-fast"
else
DOC_TARGET="html"
fi
DOC_TAG=${DOC_TAG} \
SPHINXOPTS="-j ${JOB_COUNT} -W --keep-going -T" \
SPHINXOPTS_EXTRA="-q -t publish" \
make -C doc ${DOC_TARGET}
# API documentation coverage
python3 -m coverxygen --xml-dir doc/_build/html/doxygen/xml/ --src-dir include/ --output doc-coverage.info
# deprecated page causing issues
lcov --remove doc-coverage.info \*/deprecated > new.info
genhtml --no-function-coverage --no-branch-coverage new.info -o coverage-report
- name: Compress documentation build artifacts
run: |
tar --use-compress-program="xz -T0" -cf html-output.tar.xz --exclude html/_sources --exclude html/doxygen/xml --directory=doc/_build html
tar --use-compress-program="xz -T0" -cf api-output.tar.xz --directory=doc/_build html/doxygen/html
tar --use-compress-program="xz -T0" -cf api-coverage.tar.xz coverage-report
- name: Upload HTML output
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: html-output
path: html-output.tar.xz
- name: Upload Doxygen coverage artifacts
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: api-coverage
path: api-coverage.tar.xz
- name: Summarize PR documentation URLs
if: github.event_name == 'pull_request'
run: |
REPO_NAME="${{ github.event.repository.name }}"
PR_NUM="${{ github.event.pull_request.number }}"
DOC_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/docs/"
API_DOC_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/docs/doxygen/html/"
API_COVERAGE_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/api-coverage/"
echo "${PR_NUM}" > pr_num
echo "Documentation will be available shortly at: ${DOC_URL}" >> $GITHUB_STEP_SUMMARY
echo "API Documentation will be available shortly at: ${API_DOC_URL}" >> $GITHUB_STEP_SUMMARY
echo "API Coverage Report will be available shortly at: ${API_COVERAGE_URL}" >> $GITHUB_STEP_SUMMARY
- name: Upload PR number
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
if: github.event_name == 'pull_request'
with:
name: pr_num
path: pr_num
doc-build-pdf:
name: "Documentation Build (PDF)"
needs: [doc-file-check]
if: |
github.event_name != 'pull_request'
runs-on: ubuntu-24.04
timeout-minutes: 120
concurrency:
group: doc-build-pdf-${{ github.ref }}
cancel-in-progress: true
steps:
- name: checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
path: zephyr
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: doc/requirements.txt
uses: actions/checkout@v2
- name: install-pkgs
run: |
sudo apt-get update
sudo apt-get install --no-install-recommends graphviz librsvg2-bin \
texlive-latex-base texlive-latex-extra latexmk \
texlive-fonts-recommended texlive-fonts-extra texlive-xetex \
imagemagick fonts-noto xindy
wget --no-verbose "https://github.com/doxygen/doxygen/releases/download/Release_${DOXYGEN_VERSION//./_}/doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz"
echo "${DOXYGEN_SHA256SUM} doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz" | sha256sum -c
if [ $? -ne 0 ]; then
echo "Failed to verify doxygen tarball"
exit 1
fi
sudo tar xf doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz -C /opt
echo "/opt/doxygen-${DOXYGEN_VERSION}/bin" >> $GITHUB_PATH
sudo apt-get install -y ninja-build doxygen
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@360ff9b36e58499d9eb28015cdcde7ca03a5b04d # v1.0.12
- name: cache-pip
uses: actions/cache@v1
with:
app-path: zephyr
toolchains: 'arm-zephyr-eabi'
enable-ccache: false
path: ~/.cache/pip
key: ${{ runner.os }}-doc-pip
- name: install-pip-pkgs
working-directory: zephyr
- name: install-pip
run: |
pip install -r doc/requirements.txt --require-hashes
pip3 install setuptools wheel
pip3 install -r scripts/requirements-base.txt
pip3 install -r scripts/requirements-doc.txt
pip3 install west==0.9.0
- name: west setup
run: |
west init -l . || true
- name: build-docs
shell: bash
working-directory: zephyr
continue-on-error: true
run: |
if [[ "$GITHUB_REF" =~ "refs/tags/v" ]]; then
DOC_TAG="release"
else
DOC_TAG="development"
fi
DOC_TAG=${DOC_TAG} \
SPHINXOPTS="-q -j ${JOB_COUNT}" \
LATEXMKOPTS="-quiet -halt-on-error" \
make -C doc pdf
source zephyr-env.sh
make htmldocs
tar cvf htmldocs.tar --directory=./doc/_build html
- name: upload-build
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@master
continue-on-error: True
with:
name: pdf-output
if-no-files-found: ignore
path: |
zephyr/doc/_build/latex/zephyr.pdf
zephyr/doc/_build/latex/zephyr.log
name: htmldocs.tar
path: htmldocs.tar
doc-build-status-check:
if: always()
name: "Documentation Build Status"
needs:
- doc-build-pdf
- doc-file-check
- doc-build-html
uses: ./.github/workflows/ready-to-merge.yml
with:
needs_context: ${{ toJson(needs) }}
- name: check-warns
run: |
if [ -s doc/_build/doc.warnings ]; then
docwarn=$(cat doc/_build/doc.warnings)
docwarn="${docwarn//'%'/'%25'}"
docwarn="${docwarn//$'\n'/'%0A'}"
docwarn="${docwarn//$'\r'/'%0D'}"
# We treat doc warnings as errors
echo "::error file=doc.warnings::$docwarn"
exit 1
fi
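Note: the check-warns step above has to squeeze the whole multi-line contents of doc/_build/doc.warnings into a single ::error workflow command, and GitHub's command parser stops reading the message at the first newline. The three substitutions percent-encode '%', LF and CR so the annotation survives intact. A minimal stand-alone sketch of the same escaping (the warning text below is made up for illustration):

#!/usr/bin/env bash
# Sketch: escape a multi-line message so it fits into one ::error annotation.
printf 'warning: first problem\nwarning: second problem (50%% done)\n' > doc.warnings
docwarn=$(cat doc.warnings)
docwarn="${docwarn//'%'/'%25'}"   # encode '%' first, before introducing %0A/%0D
docwarn="${docwarn//$'\n'/'%0A'}" # newlines -> %0A
docwarn="${docwarn//$'\r'/'%0D'}" # carriage returns -> %0D
echo "::error file=doc.warnings::$docwarn"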


@@ -1,87 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2021 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Documentation Publish (Pull Request)
on:
workflow_run:
workflows: ["Documentation Build"]
types:
- completed
permissions:
contents: read
jobs:
doc-publish:
name: Publish Documentation
runs-on: ubuntu-24.04
if: |
github.event.workflow_run.event == 'pull_request' &&
github.event.workflow_run.conclusion == 'success' &&
github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download artifacts
id: download-artifacts
uses: dawidd6/action-download-artifact@0bd50d53a6d7fb5cb921e607957e9cc12b4ce392 # v12
with:
workflow: doc-build.yml
run_id: ${{ github.event.workflow_run.id }}
if_no_artifact_found: ignore
- name: Load PR number
if: steps.download-artifacts.outputs.found_artifact == 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
with:
script: |
let fs = require("fs");
let pr_number = Number(fs.readFileSync("./pr_num/pr_num"));
core.exportVariable("PR_NUM", pr_number);
- name: Check PR number
if: steps.download-artifacts.outputs.found_artifact == 'true'
id: check-pr
uses: carpentries/actions/check-valid-pr@083bb9952b1414bd2b9e10ecec1717c938aba4c5 # v0.17.0
with:
pr: ${{ env.PR_NUM }}
sha: ${{ github.event.workflow_run.head_sha }}
- name: Validate PR number
if: |
steps.download-artifacts.outputs.found_artifact == 'true' &&
steps.check-pr.outputs.VALID != 'true'
run: |
echo "ABORT: PR number validation failed!"
exit 1
- name: Uncompress HTML docs
if: steps.download-artifacts.outputs.found_artifact == 'true'
run: |
tar xf html-output/html-output.tar.xz -C html-output
if [ -f api-coverage/api-coverage.tar.xz ]; then
tar xf api-coverage/api-coverage.tar.xz -C api-coverage
fi
- name: Configure AWS Credentials
if: steps.download-artifacts.outputs.found_artifact == 'true'
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708 # v5.1.1
with:
aws-access-key-id: ${{ vars.AWS_BUILDS_ZEPHYR_PR_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_BUILDS_ZEPHYR_PR_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
if: steps.download-artifacts.outputs.found_artifact == 'true'
env:
HEAD_BRANCH: ${{ github.event.workflow_run.head_branch }}
run: |
aws s3 sync --quiet html-output/html \
s3://builds.zephyrproject.org/${{ github.event.repository.name }}/pr/${PR_NUM}/docs \
--delete
if [ -d api-coverage/coverage-report ]; then
aws s3 sync --quiet api-coverage/coverage-report/ \
s3://builds.zephyrproject.org/${{ github.event.repository.name }}/pr/${PR_NUM}/api-coverage \
--delete
fi
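Note: the publish job above simply untars the html-output artifact and mirrors it to a per-PR prefix in the builds.zephyrproject.org bucket. A minimal local sketch of the same layout, assuming the artifacts have already been downloaded into html-output/ and api-coverage/ (the repository name and PR number below are illustrative, and --dryrun keeps the sync from touching the bucket):

#!/usr/bin/env bash
set -euo pipefail
REPO=zephyr        # illustrative
PR_NUM=12345       # illustrative
tar xf html-output/html-output.tar.xz -C html-output
if [ -f api-coverage/api-coverage.tar.xz ]; then
  tar xf api-coverage/api-coverage.tar.xz -C api-coverage
fi
# Dry run only: print what would be copied to the per-PR docs prefix.
aws s3 sync --dryrun html-output/html \
  s3://builds.zephyrproject.org/${REPO}/pr/${PR_NUM}/docs --delete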


@@ -1,64 +1,116 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2021 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Documentation Publish
name: Doc build for Release or Daily
# Either a daily based on schedule/cron or only on tag push
on:
workflow_run:
workflows: ["Documentation Build"]
branches:
- main
- v*
types:
- completed
permissions:
contents: read
schedule:
- cron: '50 22 * * *'
push:
tags:
# only publish v* tags, do not care about zephyr-v* which point to the
# same commit
- 'v*'
jobs:
doc-publish:
name: Publish Documentation
runs-on: ubuntu-24.04
if: |
github.event.workflow_run.event != 'pull_request' &&
github.event.workflow_run.conclusion == 'success' &&
github.repository == 'zephyrproject-rtos/zephyr'
runs-on: ubuntu-latest
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download artifacts
uses: dawidd6/action-download-artifact@0bd50d53a6d7fb5cb921e607957e9cc12b4ce392 # v12
with:
workflow: doc-build.yml
run_id: ${{ github.event.workflow_run.id }}
- name: Uncompress HTML docs
- name: Update PATH for west
run: |
tar xf html-output/html-output.tar.xz -C html-output
if [ -f api-coverage/api-coverage.tar.xz ]; then
tar xf api-coverage/api-coverage.tar.xz -C api-coverage
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Determine tag
id: tag
run: |
# We expect to get here either due to a schedule event in which
# case we are doing a daily build of the docs, or because a new
# tag was pushed, in which case we are building docs for a release
if [ ${GITHUB_EVENT_NAME} == "schedule" ]; then
echo ::set-output name=TYPE::daily;
echo ::set-output name=RELEASE::latest;
elif [ ${GITHUB_EVENT_NAME} == "push" ]; then
# If push due to a tag GITHUB_REF will look like refs/tags/TAG-FOO
# chop off 'refs/tags' so RELEASE=TAG-FOO
echo ::set-output name=TYPE::release;
echo ::set-output name=RELEASE::${GITHUB_REF/refs\/tags\//};
else
exit 1
fi
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708 # v5.1.1
uses: aws-actions/configure-aws-credentials@v1
with:
aws-access-key-id: ${{ vars.AWS_DOCS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_DOCS_SECRET_ACCESS_KEY }}
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: checkout
uses: actions/checkout@v2
- name: install-pkgs
run: |
sudo apt-get install -y ninja-build doxygen
- name: cache-pip
uses: actions/cache@v1
with:
path: ~/.cache/pip
key: ${{ runner.os }}-doc-pip
- name: install-pip
run: |
pip3 install setuptools
pip3 install -r scripts/requirements-base.txt
pip3 install -r scripts/requirements-doc.txt
- name: west setup
run: |
west init -l . || true
- name: build-docs
env:
DOC_TAG: ${{ steps.tag.outputs.TYPE }}
run: |
source zephyr-env.sh
make DOC_TAG=${DOC_TAG} htmldocs
- name: check-warns
run: |
if [ -s doc/_build/doc.warnings ]; then
docwarn=$(cat doc/_build/doc.warnings)
docwarn="${docwarn//'%'/'%25'}"
docwarn="${docwarn//$'\n'/'%0A'}"
docwarn="${docwarn//$'\r'/'%0D'}"
# We treat doc warnings as errors
echo "::error file=doc.warnings::$docwarn"
exit 1
fi
- name: Upload to AWS S3
env:
HEAD_BRANCH: ${{ github.event.workflow_run.head_branch }}
RELEASE: ${{ steps.tag.outputs.RELEASE }}
run: |
if [ "${HEAD_BRANCH:0:1}" == "v" ]; then
VERSION=${HEAD_BRANCH:1}
echo "DOC_RELEASE=[$RELEASE]"
if [ "$RELEASE" == "latest" ]; then
export
echo "publish latest docs"
aws s3 sync --quiet doc/_build/html s3://docs.zephyrproject.org/latest --delete
echo "success sync of latest docs"
else
VERSION="latest"
# we want just the version, without the leading 'v'
DOC_RELEASE=${RELEASE:1}
echo "publish release docs: ${DOC_RELEASE}"
aws s3 sync --quiet doc/_build/html s3://docs.zephyrproject.org/${DOC_RELEASE}
echo "success sync of rel docs"
fi
aws s3 sync --quiet html-output/html s3://docs.zephyrproject.org/${VERSION} --delete
aws s3 sync --quiet html-output/html/doxygen/html s3://docs.zephyrproject.org/apidoc/${VERSION} --delete
if [ -d api-coverage/coverage-report ]; then
aws s3 sync --quiet api-coverage/coverage-report/ s3://docs.zephyrproject.org/api-coverage/${VERSION} --delete
if [ -d doc/_build/doxygen/html ]; then
API_RELEASE=${RELEASE:1}
echo "publish doxygen to apidoc/${API_RELEASE}"
aws s3 sync --quiet doc/_build/doxygen/html s3://docs.zephyrproject.org/apidoc/${API_RELEASE} --delete
echo "success publish of doxygen"
fi
aws s3 cp --quiet pdf-output/zephyr.pdf s3://docs.zephyrproject.org/${VERSION}/zephyr.pdf
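Note: the Determine tag step in the listing above still uses the deprecated ::set-output syntax, while the newer workflows in this same comparison (the release job, twister-build-prep) write to $GITHUB_OUTPUT instead. A sketch of the same schedule-vs-tag decision in that style, keeping the step's output names:

# GITHUB_OUTPUT and GITHUB_EVENT_NAME are provided by the Actions runner.
if [ "${GITHUB_EVENT_NAME}" = "schedule" ]; then
  echo "TYPE=daily"     >> "$GITHUB_OUTPUT"
  echo "RELEASE=latest" >> "$GITHUB_OUTPUT"
elif [ "${GITHUB_EVENT_NAME}" = "push" ]; then
  # On a tag push GITHUB_REF is refs/tags/TAG-FOO; strip the prefix.
  echo "TYPE=release"                     >> "$GITHUB_OUTPUT"
  echo "RELEASE=${GITHUB_REF#refs/tags/}" >> "$GITHUB_OUTPUT"
else
  exit 1
fi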


@@ -1,36 +0,0 @@
name: Error numbers
on:
pull_request:
paths:
- '.github/workflows/errno.yml'
- 'lib/libc/minimal/include/errno.h'
- 'scripts/ci/errno.py'
- 'SDK_VERSION'
permissions:
contents: read
jobs:
check-errno:
runs-on: ubuntu-24.04
steps:
- name: checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
path: zephyr
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@360ff9b36e58499d9eb28015cdcde7ca03a5b04d # v1.0.12
with:
app-path: zephyr
toolchains: 'arm-zephyr-eabi'
west-group-filter: -hal,-tools,-bootloader,-babblesim
west-project-filter: -nrf_hw_models
enable-ccache: false
- name: Run errno.py
working-directory: zephyr
run: |
export ZEPHYR_SDK_INSTALL_DIR=${{ github.workspace }}/zephyr-sdk
export ZEPHYR_BASE=${PWD}
./scripts/ci/errno.py

View File

@@ -1,138 +0,0 @@
name: Footprint Tracking
on:
schedule:
- cron: '50 00 * * *'
push:
paths:
- 'VERSION'
- '.github/workflows/footprint-tracking.yml'
branches:
- main
- v*-branch
tags:
# only publish v* tags, do not care about zephyr-v* which point to the
# same commit
- 'v*'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
footprint-tracking:
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
if: github.repository_owner == 'zephyrproject-rtos'
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
defaults:
run:
shell: bash
strategy:
fail-fast: false
env:
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Install packages
run: |
sudo apt-get update
sudo apt-get install -y python3-venv
- name: checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Environment Setup
run: |
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: west setup
run: |
west init -l . || true
west config --global update.narrow true
west update 2>&1 1> west.update.log || west update 2>&1 1> west.update2.log
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708 # v5.1.1
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Record Footprint
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=${PWD}
./scripts/footprint/track.py -p scripts/footprint/plan.txt
- name: Upload footprint data
run: |
python3 -m venv .venv
. .venv/bin/activate
aws s3 sync --quiet footprint_data/ s3://testing.zephyrproject.org/footprint_data/
- name: Transform Footprint data to Twister JSON reports
run: |
shopt -s globstar
export ZEPHYR_BASE=${PWD}
python3 ./scripts/footprint/pack_as_twister.py -vvv \
--plan ./scripts/footprint/plan.txt \
--test-name='name.feature' \
./footprint_data/**/
- name: Upload to ElasticSearch
env:
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
ELASTICSEARCH_INDEX: ${{ vars.FOOTPRINT_TRACKING_INDEX }}
run: |
shopt -s globstar
run_date=`date --iso-8601=minutes`
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--flatten footprint \
--flatten-list-names "{'children':'name'}" \
--transform "{ 'footprint_name': '^(?P<footprint_area>([^\/]+\/){0,2})(?P<footprint_path>([^\/]*\/)*)(?P<footprint_symbol>[^\/]*)$' }" \
--run-id "${{ github.run_id }}" \
--run-attempt "${{ github.run_attempt }}" \
--run-workflow "footprint-tracking:${{ github.event_name }}" \
--run-branch "${{ github.ref_name }}" \
-i ${ELASTICSEARCH_INDEX} \
./footprint_data/**/twister_footprint.json
#
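Note: the footprint workflow above is a three-stage pipeline: track.py records the footprint data, pack_as_twister.py converts it to Twister-style JSON, and upload_test_results_es.py pushes it to Elasticsearch. The first two stages need no credentials and can be reproduced locally; a minimal sketch using the same commands as the steps above, run from a Zephyr tree with the SDK installed:

#!/usr/bin/env bash
set -euo pipefail
export ZEPHYR_BASE=${PWD}
# Record footprint data for the plan; results land under footprint_data/.
./scripts/footprint/track.py -p scripts/footprint/plan.txt
# Repackage the results as twister_footprint.json reports.
shopt -s globstar
python3 ./scripts/footprint/pack_as_twister.py -vvv \
  --plan ./scripts/footprint/plan.txt \
  --test-name='name.feature' \
  ./footprint_data/**/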


@@ -1,64 +0,0 @@
name: Greet first time contributor
on:
issues:
types: [opened]
pull_request_target:
types: [opened, closed]
permissions:
contents: read
jobs:
check_for_first_interaction:
runs-on: ubuntu-24.04
if: github.repository == 'zephyrproject-rtos/zephyr'
permissions:
pull-requests: write # to comment on pull requests
issues: write # to comment on issues
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- uses: zephyrproject-rtos/action-first-interaction@58853996b1ac504b8e0f6964301f369d2bb22e5c # v1.1.1+zephyr.6
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
issue-message: >
Hi @${{github.event.issue.user.login}}! We appreciate you submitting your first issue
for our open-source project. 🌟
Even though I'm a bot, I can assure you that the whole community is genuinely grateful
for your time and effort. 🤖💙
pr-opened-message: >
Hello @${{ github.event.pull_request.user.login }}, and thank you very much for your
first pull request to the Zephyr project!
Our Continuous Integration pipeline will execute a series of checks on your Pull Request
commit messages and code, and you are expected to address any failures by updating the PR.
Please take a look at [our commit message guidelines](https://docs.zephyrproject.org/latest/contribute/guidelines.html#commit-message-guidelines)
to find out how to format your commit messages, and at [our contribution workflow](https://docs.zephyrproject.org/latest/contribute/guidelines.html#contribution-workflow)
to understand how to update your Pull Request.
If you haven't already, please make sure to review the project's [Contributor
Expectations](https://docs.zephyrproject.org/latest/contribute/contributor_expectations.html)
and update (by amending and force-pushing the commits) your pull request if necessary.
If you are stuck or need help please join us on [Discord](https://chat.zephyrproject.org/)
and ask your question there. Additionally, you can [escalate the review](https://docs.zephyrproject.org/latest/contribute/contributor_expectations.html#pr-technical-escalation)
when applicable. 😊
pr-merged-message: >
Hi @${{ github.event.pull_request.user.login }}!
Congratulations on getting your very first Zephyr pull request merged 🎉🥳. This is a
fantastic achievement, and we're thrilled to have you as part of our community!
To celebrate this milestone and showcase your contribution, we'd love to award you the
Zephyr Technical Contributor badge. If you're interested, please claim your badge by
filling out this form: [Claim Your Zephyr Badge](https://forms.gle/oCw9iAPLhUsHTapc8).
Thank you for your valuable input, and we look forward to seeing more of your
contributions in the future! 🪁


@@ -1,96 +0,0 @@
name: Hello World (Multiplatform)
on:
push:
branches:
- main
- v*-branch
- collab-*
pull_request:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/**'
- '.github/workflows/hello_world_multiplatform.yaml'
- 'SDK_VERSION'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
build:
strategy:
fail-fast: false
matrix:
os: [ubuntu-22.04, ubuntu-24.04, ubuntu-24.04-arm, macos-14, windows-2022]
runs-on: ${{ matrix.os }}
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
path: zephyr
fetch-depth: 0
- name: Rebase
if: github.event_name == 'pull_request'
env:
BASE_REF: ${{ github.base_ref }}
PR_HEAD: ${{ github.event.pull_request.head.sha }}
working-directory: zephyr
shell: bash
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --graph --oneline HEAD...${PR_HEAD}
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@360ff9b36e58499d9eb28015cdcde7ca03a5b04d # v1.0.12
with:
app-path: zephyr
toolchains: arm-zephyr-eabi:riscv64-zephyr-elf
ccache-cache-key: hw-${{ matrix.os }}
- name: Build firmware
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O/tmp/twister-out"
elif [ "${{ runner.os }}-${{ runner.arch }}" == "Linux-ARM64" ]; then
EXTRA_TWISTER_FLAGS="--exclude-platform native_sim/native"
fi
west twister \
-p native_sim -p qemu_cortex_m0 -p qemu_riscv32 -p qemu_riscv64 \
--runtime-artifact-cleanup \
--force-color \
--inline-logs \
-T samples/hello_world \
-T samples/cpp/hello_world \
-v \
$EXTRA_TWISTER_FLAGS
- name: Upload artifacts
if: failure()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
if-no-files-found: ignore
path:
zephyr/twister-out/*/samples/hello_world/sample.basic.helloworld/build.log
zephyr/twister-out/*/samples/cpp/hello_world/sample.cpp.helloworld/build.log
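Note: on Linux x64 runners the build step above boils down to a single twister invocation over the hello_world samples. A sketch of the same smoke build on a local Linux host, assuming an initialized west workspace with the Zephyr SDK available (macOS, Windows and Linux ARM64 add the EXTRA_TWISTER_FLAGS shown above):

#!/usr/bin/env bash
set -euo pipefail
# Run from the zephyr repository root inside a west workspace.
west twister \
  -p native_sim -p qemu_cortex_m0 -p qemu_riscv32 -p qemu_riscv64 \
  --runtime-artifact-cleanup \
  --force-color \
  --inline-logs \
  -T samples/hello_world \
  -T samples/cpp/hello_world \
  -v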


@@ -1,11 +1,8 @@
name: Issue Tracker
on:
schedule:
- cron: '10 1/4 * * *'
permissions:
contents: read
issues:
types: [opened, labeled, closed]
env:
OUTPUT_FILE_NAME: IssuesReport.md
@@ -17,20 +14,18 @@ env:
jobs:
track-issues:
name: "Collect Issue Stats"
runs-on: ubuntu-24.04
if: github.repository == 'zephyrproject-rtos/zephyr'
runs-on: ubuntu-latest
steps:
- name: Download configuration file
run: |
wget -q https://raw.githubusercontent.com/$GITHUB_REPOSITORY/main/.github/workflows/issues-report-config.json
wget -q https://raw.githubusercontent.com/$GITHUB_REPOSITORY/master/.github/workflows/issues-report-config.json
- name: install-packages
run: |
sudo apt-get update
sudo apt-get install discount
- uses: brcrista/summarize-issues@54c549b7d38b7db39e5c6e06fd9617e12e5c3491 # v4
- uses: brcrista/summarize-issues@v3
with:
title: 'Issues Report for ${{ github.repository }}'
configPath: 'issues-report-config.json'
@@ -38,20 +33,21 @@ jobs:
token: ${{ secrets.GITHUB_TOKEN }}
- name: upload-stats
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
continue-on-error: true
uses: actions/upload-artifact@master
continue-on-error: True
with:
name: ${{ env.OUTPUT_FILE_NAME }}
path: ${{ env.OUTPUT_FILE_NAME }}
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708 # v5.1.1
uses: aws-actions/configure-aws-credentials@v1
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID_TESTING }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY_TESTING }}
aws-region: us-east-1
- name: Post Results
run: |
mkd2html IssuesReport.md IssuesReport.html
aws s3 cp --quiet IssuesReport.html s3://testing.zephyrproject.org/issues/$GITHUB_REPOSITORY/index.html

.github/workflows/labeler.yml

@@ -0,0 +1,12 @@
name: 'Pull Request Labeler'
on:
- pull_request_target
jobs:
labeler:
name: Pull Request Labeler
runs-on: ubuntu-latest
steps:
- uses: actions/labeler@v2.1.1
with:
repo-token: '${{ secrets.GITHUB_TOKEN }}'


@@ -2,25 +2,20 @@ name: Scancode
on: [pull_request]
permissions:
contents: read
jobs:
scancode_job:
runs-on: ubuntu-24.04
runs-on: ubuntu-latest
name: Scan code for licenses
steps:
- name: Checkout the code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
uses: actions/checkout@v1
- name: Scan the code
id: scancode
uses: zephyrproject-rtos/action_scancode@23ef91ce31cd4b954366a7b71eea47520da9b380 # v4
uses: zephyrproject-rtos/action_scancode@v3
with:
directory-to-scan: 'scan/'
- name: Artifact Upload
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@v1
with:
name: scancode
path: ./artifacts


@@ -1,45 +1,29 @@
name: Manifest
on:
pull_request_target:
permissions:
contents: read
paths:
- 'west.yml'
jobs:
contribs:
runs-on: ubuntu-24.04
permissions:
pull-requests: write # to create/update pull request comments
runs-on: ubuntu-latest
name: Manifest
steps:
- name: Checkout the code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@v2
with:
path: zephyrproject/zephyr
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: west setup
env:
BASE_REF: ${{ github.base_ref }}
working-directory: zephyrproject/zephyr
run: |
pip install -r scripts/requirements-west.txt --require-hashes
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
west init -l . || true
- name: Manifest
uses: zephyrproject-rtos/action-manifest@09983f53d3d878791aa37a7755ae44d695f4c1e5 # v2.0.0
uses: zephyrproject-rtos/action-manifest@main
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
manifest-path: 'west.yml'
checkout-path: 'zephyrproject/zephyr'
use-tree-checkout: 'true'
check-impostor-commits: 'true'
label-prefix: 'manifest-'
verbosity-level: '1'
labels: 'manifest'
dnm-labels: 'DNM (manifest)'
blobs-added-labels: 'Binary Blobs Added'
blobs-modified-labels: 'Binary Blobs Modified'
labels: 'manifest, west'
dnm-labels: 'DNM'


@@ -1,19 +0,0 @@
name: Check SHA-pinned GitHub Actions
on:
pull_request:
paths:
- '.github/workflows/**'
permissions:
contents: read
jobs:
check-sha-pinned-actions:
name: Verify GitHub Actions
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Ensure SHA pinned actions
uses: zgosalvez/github-actions-ensure-sha-pinned-actions@6124774845927d14c601359ab8138699fa5b70c3 # v4.0.1


@@ -1,46 +0,0 @@
name: PR Metadata Check
on:
pull_request:
types:
- synchronize
- opened
- reopened
- labeled
- unlabeled
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
do-not-merge:
name: Prevent Merging
runs-on: ubuntu-24.04
timeout-minutes: 30
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python dependencies
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run the check script
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
python -u scripts/ci/do_not_merge.py \
-p "${{ github.event.pull_request.number }}" \
-o "${{ github.event.repository.owner.login }}" \
-r "${{ github.event.repository.name }}"


@@ -1,53 +0,0 @@
# Copyright (c) 2023 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Misc. Pylib Scripts TestSuite
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/build_helpers/**'
- '.github/workflows/pylib_tests.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/build_helpers/**'
- '.github/workflows/pylib_tests.yml'
permissions:
contents: read
jobs:
pylib-tests:
name: Misc. Pylib Unit Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04]
steps:
- name: checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run pytest for build_helpers
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
echo "Run build_helpers tests"
PYTHONPATH=./scripts/tests pytest ./scripts/tests/build_helpers
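Note: the test step above is self-contained and easy to reproduce outside CI. A sketch, assuming a Zephyr checkout and a supported Python version on PATH:

#!/usr/bin/env bash
set -euo pipefail
# Same environment and invocation as the workflow step above.
export ZEPHYR_BASE=./
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
pip install -r scripts/requirements-actions.txt --require-hashes
PYTHONPATH=./scripts/tests pytest ./scripts/tests/build_helpers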


@@ -1,31 +0,0 @@
name: ready to merge
on:
workflow_call:
inputs:
needs_context:
type: string
required: true
permissions:
contents: read
jobs:
all_jobs_passed:
name: all jobs passed
runs-on: ubuntu-latest
steps:
- name: "Check status of all required jobs"
run: |-
NEEDS_CONTEXT='${{ inputs.needs_context }}'
JOB_IDS=$(echo "$NEEDS_CONTEXT" | jq -r 'keys[]')
for JOB_ID in $JOB_IDS; do
RESULT=$(echo "$NEEDS_CONTEXT" | jq -r ".[\"$JOB_ID\"].result")
echo "$JOB_ID job result: $RESULT"
if [[ $RESULT != "success" && $RESULT != "skipped" ]]; then
echo "***"
echo "Error: The $JOB_ID job did not pass."
exit 1
fi
done
echo "All jobs passed or were skipped."


@@ -4,58 +4,54 @@ on:
push:
tags:
- 'v*'
- '!v*rc*'
permissions:
contents: read
jobs:
release:
runs-on: ubuntu-24.04
permissions:
contents: write # to create GitHub release entry
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Get the version
id: get_version
run: |
echo "VERSION=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
echo "TRIMMED_VERSION=${GITHUB_REF#refs/tags/v}" >> $GITHUB_OUTPUT
run: echo ::set-output name=VERSION::${GITHUB_REF#refs/tags/}
- name: REUSE Compliance Check
uses: fsfe/reuse-action@676e2d560c9a403aa252096d99fcab3e1132b0f5 # v6.0.0
uses: fsfe/reuse-action@v1
with:
args: spdx -o zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
- name: upload-results
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
continue-on-error: true
uses: actions/upload-artifact@master
continue-on-error: True
with:
name: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
path: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
- name: Create empty release notes body
- name: Get Diff since last tag
run: |
echo "TODO: add release overview and notes link" > release-notes.txt
oldtag=$(git describe --abbrev=0 ${{ github.ref }}^)
echo "Changes since ${oldtag}:" > release-notes.txt
echo "" >> release-notes.txt
echo "" >> release-notes.txt
git shortlog ${oldtag}..${{ github.ref }} >> release-notes.txt
- name: Create Release
id: create_release
uses: actions/create-release@0cb9c9b65d5d1901c1f53e5e66eaf4afd303e70e # v1.1.4
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
tag_name: ${{ github.ref }}
release_name: Zephyr ${{ steps.get_version.outputs.TRIMMED_VERSION }}
release_name: Zephyr ${{ github.ref }}
body_path: release-notes.txt
draft: true
prerelease: true
- name: Upload Release Assets
id: upload-release-asset
uses: actions/upload-release-asset@e8f9f06c4b078e705bd2ea027f0926603fc9b4d5 # v1.0.2
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:


@@ -1,61 +0,0 @@
# This workflow uses actions that are not certified by GitHub. They are provided
# by a third-party and are governed by separate terms of service, privacy
# policy, and support documentation.
name: Scorecards supply-chain security
on:
# For Branch-Protection check. Only the default branch is supported. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection
branch_protection_rule:
# To guarantee Maintained check is occasionally updated. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#maintained
schedule:
- cron: '43 7 * * 6'
push:
branches:
- main
permissions: read-all
jobs:
analysis:
name: Scorecard analysis
runs-on: ubuntu-latest
permissions:
# Needed for Code scanning upload
security-events: write
# Needed for GitHub OIDC token if publish_results is true
id-token: write
steps:
- name: "Checkout code"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: "Run analysis"
uses: ossf/scorecard-action@4eaacf0543bb3f2c246792bd56e8cdeffafb205a # v2.4.3
with:
results_file: results.sarif
results_format: sarif
# Publish results to OpenSSF REST API for easy access by consumers.
# - Allows the repository to include the Scorecard badge.
# - See https://github.com/ossf/scorecard-action#publishing-results.
publish_results: true
# Upload the results as artifacts (optional). Commenting out will disable
# uploads of run results in SARIF format to the repository Actions tab.
# https://docs.github.com/en/actions/advanced-guides/storing-workflow-data-as-artifacts
- name: "Upload artifact"
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: SARIF file
path: results.sarif
retention-days: 5
# Upload the results to GitHub's code scanning dashboard (optional).
# Commenting out will disable upload of results to your repo's Code Scanning dashboard
- name: "Upload to code-scanning"
uses: github/codeql-action/upload-sarif@cdefb33c0f6224e58673d9004f47f7cb3e328b89 # v4.31.10
with:
sarif_file: results.sarif


@@ -1,70 +0,0 @@
# Copyright 2023 Google LLC
# SPDX-License-Identifier: Apache-2.0
name: Scripts tests
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/build/**'
- '.github/workflows/scripts_tests.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/build/**'
- '.github/workflows/scripts_tests.yml'
permissions:
contents: read
jobs:
scripts-tests:
name: Scripts tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04]
steps:
- name: checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Rebase
continue-on-error: true
env:
BASE_REF: ${{ github.base_ref }}
PR_HEAD: ${{ github.event.pull_request.head.sha }}
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --graph --oneline HEAD...${PR_HEAD}
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run pytest
env:
ZEPHYR_BASE: ./
run: |
echo "Run script tests"
pytest ./scripts/build


@@ -1,33 +0,0 @@
name: Stale Workflow Queue Cleanup
on:
workflow_dispatch:
schedule:
# everyday at 15:00
- cron: '0 15 * * *'
permissions:
contents: read
concurrency:
group: stale-workflow-queue-cleanup
cancel-in-progress: true
jobs:
cleanup:
name: Cleanup
runs-on: ubuntu-24.04
if: github.ref == 'refs/heads/main'
permissions:
actions: write # to delete stale workflow runs
steps:
- name: Delete stale queued workflow runs
uses: MajorScruffy/delete-old-workflow-runs@78b5af714fefaefdf74862181c467b061782719e # v0.3.0
with:
repository: ${{ github.repository }}
# Remove any workflow runs in "queued" state for more than 1 day
older-than-seconds: 86400
status: queued
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -3,33 +3,21 @@ on:
schedule:
- cron: "16 00 * * *"
permissions:
contents: read
jobs:
stale:
name: Find Stale issues and PRs
runs-on: ubuntu-24.04
runs-on: ubuntu-latest
if: github.repository == 'zephyrproject-rtos/zephyr'
permissions:
pull-requests: write # to comment on stale pull requests
issues: write # to comment on stale issues
steps:
- uses: actions/stale@997185467fa4f803885201cee163a9f38240193d # v10.1.1
- uses: actions/stale@v3
with:
stale-pr-message: 'This pull request has been marked as stale because it has been open (more
than) 60 days with no activity. Remove the stale label or add a comment saying that you
would like to have the label removed otherwise this pull request will automatically be
closed in 14 days. Note, that you can always re-open a closed pull request at any time.'
stale-issue-message: 'This issue has been marked as stale because it has been open (more
than) 60 days with no activity. Remove the stale label or add a comment saying that you
would like to have the label removed otherwise this issue will automatically be closed in
14 days. Note, that you can always re-open a closed issue at any time.'
repo-token: ${{ secrets.GITHUB_TOKEN }}
stale-pr-message: 'This pull request has been marked as stale because it has been open (more than) 60 days with no activity. Remove the stale label or add a comment saying that you would like to have the label removed otherwise this pull request will automatically be closed in 14 days. Note, that you can always re-open a closed pull request at any time.'
stale-issue-message: 'This issue has been marked as stale because it has been open (more than) 60 days with no activity. Remove the stale label or add a comment saying that you would like to have the label removed otherwise this issue will automatically be closed in 14 days. Note, that you can always re-open a closed issue at any time.'
days-before-stale: 60
days-before-close: 14
stale-issue-label: 'Stale'
stale-pr-label: 'Stale'
exempt-pr-labels: 'Blocked,In progress'
exempt-issue-labels: 'In progress,Enhancement,Feature,Feature Request,RFC,Meta,Process,Coverity'
exempt-issue-labels: 'In progress,Enhancement,Feature,Feature Request,RFC,Meta'
operations-per-run: 400


@@ -1,38 +0,0 @@
name: Merged PR stats
on:
pull_request_target:
branches:
- main
- v*-branch
types: [closed]
permissions:
contents: read
jobs:
record_merged:
if: github.event.pull_request.merged == true && github.repository == 'zephyrproject-rtos/zephyr'
runs-on: ubuntu-24.04
steps:
- name: checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: PR event
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
PR_STAT_ES_INDEX: ${{ vars.PR_STAT_ES_INDEX }}
run: |
python3 ./scripts/ci/stats/merged_prs.py --pull-request ${{ github.event.pull_request.number }} --repo ${{ github.repository }}


@@ -1,64 +0,0 @@
name: Publish Twister Test Results
on:
workflow_run:
workflows: ["Run tests with twister"]
branches:
- main
types:
- completed
permissions:
contents: read
jobs:
upload-to-elasticsearch:
if: |
github.repository == 'zephyrproject-rtos/zephyr' &&
github.event.workflow_run.event != 'pull_request'
env:
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
runs-on: ubuntu-24.04
steps:
# Needed for elasticsearch and the upload script
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Download Artifacts
id: download-artifacts
uses: dawidd6/action-download-artifact@0bd50d53a6d7fb5cb921e607957e9cc12b4ce392 # v12
with:
path: artifacts
workflow: twister.yml
run_id: ${{ github.event.workflow_run.id }}
if_no_artifact_found: ignore
- name: Upload to elasticsearch
if: steps.download-artifacts.outputs.found_artifact == 'true'
run: |
# set run date on upload to get consistent and unified data across the matrix.
run_date=`date --iso-8601=minutes`
if [ "${{github.event.workflow_run.event}}" = "push" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--run-attempt ${{github.run_attempt}} \
--run-branch ${{github.ref_name}} \
--index zephyr-main-ci-push-1 artifacts/*/*/twister.json
elif [ "${{github.event.workflow_run.event}}" = "schedule" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--run-attempt ${{github.run_attempt}} \
--run-branch ${{github.ref_name}} \
--index zephyr-main-ci-weekly-1 artifacts/*/*/twister.json
fi
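Note: the upload step above routes results to a different Elasticsearch index depending on the event that triggered the original twister run. An equivalent sketch of the same mapping written as a case statement (EVENT stands in for github.event.workflow_run.event; options other than -r and --index are omitted here):

#!/usr/bin/env bash
set -euo pipefail
EVENT="push"   # illustrative value
run_date=$(date --iso-8601=minutes)
case "$EVENT" in
  push)     INDEX=zephyr-main-ci-push-1 ;;
  schedule) INDEX=zephyr-main-ci-weekly-1 ;;
  *)        echo "no index for event '$EVENT', skipping"; exit 0 ;;
esac
python3 ./scripts/ci/upload_test_results_es.py -r "${run_date}" \
  --index "${INDEX}" artifacts/*/*/twister.json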


@@ -1,402 +0,0 @@
name: Run tests with twister
on:
push:
branches:
- main
- v*-branch
- collab-*
pull_request:
branches:
- main
- v*-branch
- collab-*
schedule:
# Run at 02:00 UTC on every Sunday
- cron: '0 2 * * 0'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
twister-build-prep:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on: ubuntu-24.04
outputs:
subset: ${{ steps.output-services.outputs.subset }}
size: ${{ steps.output-services.outputs.size }}
fullrun: ${{ steps.output-services.outputs.fullrun }}
env:
MATRIX_SIZE: 10
PUSH_MATRIX_SIZE: 20
WEEKLY_MATRIX_SIZE: 200
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
TESTS_PER_BUILDER: 900
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
steps:
- name: Checkout
if: github.event_name == 'pull_request'
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
path: zephyr
persist-credentials: false
- name: Set up Python
if: github.event_name == 'pull_request'
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: install-packages
working-directory: zephyr
if: github.event_name == 'pull_request'
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Setup Zephyr project
if: github.event_name == 'pull_request'
uses: zephyrproject-rtos/action-zephyr-setup@360ff9b36e58499d9eb28015cdcde7ca03a5b04d # v1.0.12
with:
app-path: zephyr
enable-ccache: false
- name: Environment Setup
working-directory: zephyr
if: github.event_name == 'pull_request'
run: |
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
- name: Generate Test Plan with Twister
working-directory: zephyr
if: github.event_name == 'pull_request'
id: test-plan
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --pull-request -t $TESTS_PER_BUILDER
if [ -s .testplan ]; then
cat .testplan >> $GITHUB_ENV
else
echo "TWISTER_NODES=${MATRIX_SIZE}" >> $GITHUB_ENV
fi
rm -f testplan.json .testplan
- name: Determine matrix size
id: output-services
run: |
if [ "${{github.event_name}}" = "push" ]; then
subset="[$(seq -s',' 1 ${PUSH_MATRIX_SIZE})]"
size=${MATRIX_SIZE}
elif [ "${{github.event_name}}" = "pull_request" ]; then
if [ -n "${TWISTER_NODES}" ]; then
subset="[$(seq -s',' 1 ${TWISTER_NODES})]"
else
subset="[$(seq -s',' 1 ${MATRIX_SIZE})]"
fi
size=${TWISTER_NODES}
elif [ "${{github.event_name}}" = "schedule" -a "${{github.repository}}" = "zephyrproject-rtos/zephyr" ]; then
subset="[$(seq -s',' 1 ${WEEKLY_MATRIX_SIZE})]"
size=${WEEKLY_MATRIX_SIZE}
else
size=0
fi
echo "subset=${subset}" >> $GITHUB_OUTPUT
echo "size=${size}" >> $GITHUB_OUTPUT
echo "fullrun=${TWISTER_FULL}" >> $GITHUB_OUTPUT
twister-build:
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
needs: twister-build-prep
if: needs.twister-build-prep.outputs.size != 0
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
subset: ${{fromJSON(needs.twister-build-prep.outputs.subset)}}
timeout-minutes: 1440
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
# `--specs` is ignored because ccache is unable to resolve the toolchain specs file path.
CCACHE_IGNOREOPTIONS: '-specs=* --specs=*'
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
TWISTER_COMMON: ' --test-config tests/test_config_ci.yaml --force-color --inline-logs -v -N -M --retry-failed 3 --timeout-multiplier 2 '
WEEKLY_OPTIONS: ' -M --build-only --all --show-footprint --report-filtered -j 32'
PR_OPTIONS: ' --clobber-output --integration -j 16'
PUSH_OPTIONS: ' --clobber-output -M --show-footprint --report-filtered -j 16'
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
LLVM_TOOLCHAIN_PATH: /usr/lib/llvm-20
steps:
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Environment Setup
run: |
if [ "${{github.event_name}}" = "pull_request" ]; then
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Builder"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
fi
echo "$HOME/.local/bin" >> $GITHUB_PATH
echo "$HOME/.cargo/bin" >> $GITHUB_PATH
west init -l . || true
west config manifest.group-filter -- +ci,+optional,+testing
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Check Environment
run: |
cmake --version
gcc --version
cargo --version
rustup target list --installed
ls -la
echo "github.ref: ${{ github.ref }}"
echo "github.base_ref: ${{ github.base_ref }}"
echo "github.ref_name: ${{ github.ref_name }}"
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
west packages pip --install
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- name: Update BabbleSim to manifest revision
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git -c advice.detachedHead=false checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- if: github.event_name == 'push'
name: Run Tests with Twister (Push)
id: run_twister
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} ${TWISTER_COMMON} ${PUSH_OPTIONS}
if [ "${{matrix.subset}}" = "1" ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${PUSH_OPTIONS}
fi
fi
- if: github.event_name == 'pull_request'
name: Run Tests with Twister (Pull Request)
id: run_twister_pr
run: |
rm -f testplan.json
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --pull-request
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} --load-tests testplan.json ${TWISTER_COMMON} ${PR_OPTIONS}
if [ "${{matrix.subset}}" = "1" -a ${{needs.twister-build-prep.outputs.fullrun}} = 'True' ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${PR_OPTIONS}
fi
fi
- if: github.event_name == 'schedule'
name: Run Tests with Twister (Weekly)
id: run_twister_sched
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} ${TWISTER_COMMON} ${WEEKLY_OPTIONS}
if [ "${{matrix.subset}}" = "1" ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${WEEKLY_OPTIONS}
fi
fi
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: Unit Test Results (Subset ${{ matrix.subset }})
if-no-files-found: ignore
path: |
twister-out/twister.xml
twister-out/twister.json
module_tests/twister.xml
testplan.json
- if: matrix.subset == 1 && github.event_name == 'push'
name: Save the list of Python packages
shell: bash
run: |
FREEZE_FILE="frozen-requirements.txt"
timestamp="$(date)"
version="$(git describe --abbrev=12 --always)"
echo -e "# Generated at $timestamp ($version)\n" > $FREEZE_FILE
pip freeze | tee -a $FREEZE_FILE
- if: matrix.subset == 1 && github.event_name == 'push'
name: Upload the list of Python packages
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: Frozen PIP package set
path: |
frozen-requirements.txt
twister-test-results:
name: "Publish Unit Tests Results"
needs:
- twister-build
runs-on: ubuntu-24.04
permissions:
checks: write # to create the check run entry with Twister test results
# the build-and-test job might be skipped, we don't need to run this job then
if: success() || failure()
steps:
- name: Check out source code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Download Artifacts
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
path: artifacts
- name: Merge Test Results
run: |
junitparser merge --glob 'artifacts/*/twister.xml' 'artifacts/*/*/twister.xml' junit.xml
junit2html junit.xml junit.html
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: Unit Test Results
if-no-files-found: ignore
path: |
junit.html
junit.xml
- name: Upload test results to Codecov
if: ${{ !cancelled() && (github.event_name == 'push') }}
uses: codecov/test-results-action@0fa95f0e1eeaafde2c782583b36b28ad0d8c77d3 # v1.2.1
with:
token: ${{ secrets.CODECOV_TOKEN }}
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@27d65e188ec43221b20d26de30f4892fad91df2f # v2.22.0
with:
check_name: Unit Test Results
files: "**/twister.xml"
comment_mode: off
- name: Analyze Twister Reports
if: needs.twister-build.result == 'failure'
run: |
./scripts/ci/twister_report_analyzer.py artifacts/*/*/twister.json --long-summary --platforms --output-md errors.md
if [[ -s "errors.md" ]]; then
echo '### Error Summary! 🚀' >> $GITHUB_STEP_SUMMARY
cat errors.md >> $GITHUB_STEP_SUMMARY
fi
- name: Upload Twister Analysis Results
if: needs.twister-build.result == 'failure'
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: Twister Analysis Results
if-no-files-found: ignore
path: |
twister_report_summary.json
twister-status-check:
if: always()
name: "Check Twister Status"
needs:
- twister-build-prep
- twister-build
uses: ./.github/workflows/ready-to-merge.yml
with:
needs_context: ${{ toJson(needs) }}
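Note: twister-build-prep publishes the build matrix as a string like "[1,2,...,N]" that twister-build expands with fromJSON(); on pull requests N comes from test_plan.py via the .testplan file (TWISTER_NODES), otherwise the fixed MATRIX_SIZE values are used. A sketch of how that string is produced:

#!/usr/bin/env bash
MATRIX_SIZE=10
TWISTER_NODES=4   # illustrative; on PRs test_plan.py writes this via .testplan
# seq -s',' joins 1..N with commas; the brackets turn it into a JSON array.
subset="[$(seq -s',' 1 ${TWISTER_NODES:-$MATRIX_SIZE})]"
echo "subset=${subset}"                    # -> subset=[1,2,3,4]
echo "size=${TWISTER_NODES:-$MATRIX_SIZE}" # -> size=4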


@@ -5,29 +5,17 @@ name: Twister TestSuite
on:
push:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/pylib/**'
- 'scripts/pylib/twister/**'
- 'scripts/twister'
- 'scripts/tests/twister/**'
- '.github/workflows/twister_tests.yml'
- 'scripts/schemas/twister/'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/**'
- 'scripts/pylib/twister/**'
- 'scripts/twister'
- 'scripts/tests/twister/**'
- '.github/workflows/twister_tests.yml'
- 'scripts/schemas/twister/'
permissions:
contents: read
jobs:
twister-tests:
@@ -35,35 +23,30 @@ jobs:
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04]
python-version: [3.6, 3.7, 3.8]
os: [ubuntu-latest]
steps:
- name: checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v1
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v1
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install-packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run pytest for twisterlib
pip3 install pytest colorama pyyaml ply mock
- name: Run pytest
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
echo "Run twister tests"
PYTHONPATH=./scripts/tests pytest ./scripts/tests/twister
- name: Run pytest for pytest-twister-harness
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
PYTHONPATH: ./scripts/pylib/pytest-twister-harness/src:${PYTHONPATH}
run: |
echo "Run twister tests"
pytest ./scripts/pylib/pytest-twister-harness/tests


@@ -1,66 +0,0 @@
# Copyright (c) 2023 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Twister BlackBox TestSuite
on:
pull_request:
branches:
- main
- collab-*
- v*-branch
paths:
- 'scripts/pylib/twister/**'
- 'scripts/twister'
- 'scripts/tests/twister_blackbox/**'
- '.github/workflows/twister_tests_blackbox.yml'
permissions:
contents: read
env:
PYTHONIOENCODING: utf-8
jobs:
twister-tests:
name: Twister Black Box Tests
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
fail-fast: false
runs-on: ubuntu-24.04
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
path: zephyr
fetch-depth: 0
- name: Set Up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@360ff9b36e58499d9eb28015cdcde7ca03a5b04d # v1.0.12
with:
app-path: zephyr
toolchains: 'arm-zephyr-eabi:riscv64-zephyr-elf:x86_64-zephyr-elf'
enable-ccache: false
west-group-filter: -tools,-bootloader,-babblesim,-hal
west-project-filter: -nrf_hw_models,+cmsis,+hal_xtensa,+cmsis_6
- name: Run Pytest For Twister Black Box Tests
working-directory: zephyr
shell: bash
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
export ZEPHYR_SDK_INSTALL_DIR=${{ github.workspace }}/zephyr-sdk
sudo apt-get install -y lcov
echo "Run twister tests"
source zephyr-env.sh
PYTHONPATH="./scripts/tests" pytest ./scripts/tests/twister_blackbox/


@@ -5,55 +5,64 @@ name: Zephyr West Command Tests
on:
push:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/west-commands.yml'
- 'scripts/west_commands/**'
- '.github/workflows/west_cmds.yml'
pull_request:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/west-commands.yml'
- 'scripts/west_commands/**'
- '.github/workflows/west_cmds.yml'
permissions:
contents: read
jobs:
west-commands:
west-commnads:
name: West Command Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-22.04, macos-14, windows-2022]
python-version: [3.6, 3.7, 3.8]
os: [ubuntu-latest, macos-latest, windows-latest]
steps:
- name: checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v1
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v1
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: cache-pip-mac
if: startsWith(runner.os, 'macOS')
uses: actions/cache@v1
with:
path: ~/Library/Caches/pip
# Trailing '-' was just to get a different cache name
key: ${{ runner.os }}-pip-${{ matrix.python-version }}-
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}-
- name: cache-pip-win
if: startsWith(runner.os, 'Windows')
uses: actions/cache@v1
with:
path: ~\AppData\Local\pip\Cache
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install pytest
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
pip3 install wheel
pip3 install pytest west pyelftools canopen progress mypy intelhex psutil
- name: run pytest-win
if: runner.os == 'Windows'
run: |
python ./scripts/west_commands/run_tests.py
- name: run pytest-mac-linux
if: runner.os != 'Windows'
run: |

.gitignore

@@ -1,6 +1,3 @@
# SPDX-License-Identifier: Apache-2.0
# SPDX-FileCopyrightText: Copyright The Zephyr Project Contributors
*.o
*.a
*.d
@@ -10,16 +7,9 @@
*.swp
*.swo
*~
# Emacs
\#*\#
build*/
!doc/build/
!scripts/build
!share/sysbuild/build
!doc/guides/build
!tests/drivers/build_all
!scripts/pylib/build_helpers
cscope.*
.dir
@@ -33,8 +23,6 @@ outdir
outdir-*
scripts/basic/fixdep
scripts/gen_idt/gen_idt
coverage-report
doc-coverage.info
doc/_build
doc/doxygen
doc/xml
@@ -45,10 +33,7 @@ doc/latex
doc/themes/zephyr-docs-theme
sanity-out*
twister-out*
bsim_out
bsim_bt_out
myresults.xml
tests/RunResults.xml
scripts/grub
doc/reference/kconfig/*.rst
doc/doc.warnings
@@ -57,28 +42,6 @@ doc/doc.warnings
.envrc
.vscode
hide-defaults-note
venv
.venv
.envrc
.DS_Store
.clangd
new.info
# Cargo drops lock files in projects to capture resolved dependencies.
# We don't want to record these.
Cargo.lock
# Cargo encourages a .cargo/config.toml file to symlink to a generated file. Don't save these.
.cargo/
# Normal west builds will place the Rust target directory under the build directory. However,
# sometimes IDEs and such will litter these target directories as well.
target/
# CI output
compliance.xml
dts_linter.patch
_error.types
# Tag files
GPATH
@@ -88,39 +51,3 @@ TAGS
tags
.idea
# from check_compliance.py
# zephyr-keep-sorted-start
BinaryFiles.txt
BoardYml.txt
Checkpatch.txt
ClangFormat.txt
DevicetreeBindings.txt
DevicetreeLinting.txt
GitDiffCheck.txt
Gitlint.txt
Identity.txt
ImageSize.txt
Kconfig.txt
KconfigBasic.txt
KconfigBasicNoModules.txt
KconfigHWMv2.txt
KeepSorted.txt
LicenseAndCopyrightCheck.txt
MaintainersFormat.txt
ModulesMaintainers.txt
Nits.txt
Pylint.txt
PythonCompat.txt
Ruff.txt
SphinxLint.txt
SysbuildKconfig.txt
SysbuildKconfigBasic.txt
SysbuildKconfigBasicNoModules.txt
TextEncoding.txt
YAMLLint.txt
ZephyrModuleFile.txt
# zephyr-keep-sorted-stop
# Node dependencies
node_modules


@@ -1,15 +1,10 @@
# All these sections are optional, edit this file as you like.
# Zephyr-specific defaults are located in scripts/gitlint/zephyr_commit_rules.py
[general]
regex-style-search=true
ignore=title-trailing-punctuation, T3, title-max-length, T1, body-hard-tab, B3, B1
# verbosity should be a value between 1 and 3, the commandline -v flags take precedence over this
verbosity = 3
# By default gitlint will ignore merge commits. Set to 'false' to disable.
ignore-merge-commits=false
ignore-revert-commits=false
ignore-fixup-commits=false
ignore-squash-commits=false
ignore-merge-commits=true
# Enable debug mode (prints more output). Disabled by default
debug = false
@@ -18,16 +13,16 @@ debug = false
extra-path=scripts/gitlint
[title-max-length-no-revert]
# line-length=75
line-length=75
[body-min-line-count]
# min-line-count=1
min-line-count=1
[body-max-line-count]
# max-line-count=200
max-line-count=200
[title-starts-with-subsystem]
regex = ^(?!subsys:)(?!treewide:)(([^:]+):)(\s([^:]+):)*\s(.+)$
regex = ^(?!subsys:)(([^:]+):)(\s([^:]+):)*\s(.+)$
[title-must-not-contain-word]
# Comma-separated list of words that should not occur in the title. Matching is case
@@ -44,7 +39,7 @@ words=wip
[max-line-length-with-exceptions]
# B1 = body-max-line-length
# line-length=75
line-length=72
[body-min-length]
min-length=3
@@ -60,7 +55,3 @@ ignore-merge-commits=false
# By specifying this rule, developers can only change the file when they explicitly reference
# it in the commit message.
#files=gitlint/rules.py,README.md
[ignore-by-author-name]
regex=^dependabot\[bot\]$
ignore=all
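
As a rough illustration of how the [title-starts-with-subsystem] rule above behaves, the following minimal Python sketch applies the variant of the regex shown in this diff that also rejects "treewide:" titles. This is only an approximation for reading purposes; the actual enforcement is done by gitlint using the extra rules under scripts/gitlint (see scripts/gitlint/zephyr_commit_rules.py referenced above).

import re

# Variant of the [title-starts-with-subsystem] regex shown in the diff above.
TITLE_RE = re.compile(r"^(?!subsys:)(?!treewide:)(([^:]+):)(\s([^:]+):)*\s(.+)$")

def title_ok(title: str) -> bool:
    """Return True if the commit title starts with an area prefix like 'foo: bar: ...'."""
    return TITLE_RE.match(title) is not None

assert title_ok("Bluetooth: controller: Check length field of scan response data")
assert not title_ok("fix a bug without any subsystem prefix")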

.known-issues/README

@@ -0,0 +1,55 @@
This directory contains configuration files to ignore errors found in
the build and test process which are known to the developers and for
now can be safely ignored.
To use:
$ cd zephyr
$ make SOMETHING >& result
$ scripts/filter-known-issues.py result
It is included in the source tree so that if anyone has to submit
something that triggers a false-positive error, they can include the
"ignore me" file, properly documented.
Each file can contain one or more multiline Python regular expressions
(https://docs.python.org/2/library/re.html#regular-expression-syntax)
that match an error message. Multiple regular expressions are
separated by comment blocks (that start with #). Note that an empty
line still is considered part of the multiline regular expression.
For example
---beginning---
#
# This testcase always fails, pending fix ZEP-1234
#
.*/tests/kernel/grumpy .* FAIL
#
# Documentation issue, masks:
#
# /home/e/inaky/z/kernel.git/doc/api/io_interfaces.rst:28: WARNING: Invalid definition: Expected identifier in nested name. [error at 19]
# struct dev_config::@65 dev_config::bits
# -------------------^
#
^(?P<filename>.+/doc/api/io_interfaces.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^\s+struct dev_config::@[0-9]+ dev_config::bits.*
^\s+-+\^
---end---
Note you want to:
- use relative paths; instead of
/home/me/mydir/zephyr/something/somewhere.c you will want
^.*/something/somewhere.c (as they will depend on where it is being
built)
- Replace line numbers with [0-9]+, as they will change
- (?P<filename>[-._/\w]+/something/somewhere.c) saves the match on
that file path in a "variable" called 'filename' that later you can
match with (?P=filename) if you want to match multiple lines of the
same error message.
Regular expressions can get really twisted and interesting; they are
powerful, so start small :)
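
The matching approach this README describes can be sketched in a few lines of Python. The snippet below is only a minimal illustration, assuming the split-on-comment-blocks behaviour described above; the real scripts/filter-known-issues.py may differ in its details.

import re
from pathlib import Path

def load_patterns(config_file: str) -> list[re.Pattern]:
    """Split a .known-issues file into multiline regexes on '#' comment blocks."""
    patterns, chunk = [], []
    for line in Path(config_file).read_text().splitlines():
        if line.startswith("#"):
            if chunk:
                patterns.append(re.compile("\n".join(chunk), re.MULTILINE))
                chunk = []
        else:
            # Empty lines stay inside the chunk: they are part of the regex.
            chunk.append(line)
    if chunk:
        patterns.append(re.compile("\n".join(chunk), re.MULTILINE))
    return patterns

def filter_known_issues(output: str, patterns: list[re.Pattern]) -> str:
    """Remove every span of the build/test output matched by a known-issue regex."""
    for pattern in patterns:
        output = pattern.sub("", output)
    return output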


@@ -0,0 +1,39 @@
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]file_system[/\\]index.rst):(?P<lineno>[0-9]+): WARNING: Duplicate C declaration, also defined at (.*)\.$
^Declaration is \'.*\'\.$
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]peripherals[/\\]dma.rst):(?P<lineno>[0-9]+): WARNING: Duplicate C declaration, also defined at (.*)\.$
^Declaration is \'.*\'\.$
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]audio[/\\]dmic.rst):(?P<lineno>[0-9]+): WARNING: Duplicate C declaration, also defined at (.*)\.$
^Declaration is \'.*\'\.$
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]networking[/\\]net_if.rst):(?P<lineno>[0-9]+): WARNING: Duplicate C declaration, also defined at (.*)\.$
^Declaration is \'.*\'\.$
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]networking[/\\]ieee802154.rst):(?P<lineno>[0-9]+): WARNING: Duplicate C declaration, also defined at (.*)\.$
^Declaration is \'.*\'\.$
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]networking[/\\]sockets.rst):(?P<lineno>[0-9]+): WARNING: Duplicate C declaration, also defined at (.*)\.$
^Declaration is \'.*\'\.$
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]bluetooth[/\\]uuid.rst):(?P<lineno>[0-9]+): WARNING: Duplicate C declaration, also defined at (.*)\.$
^Declaration is \'.*\'\.$
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]bluetooth[/\\]sdp.rst):(?P<lineno>[0-9]+): WARNING: Duplicate C declaration, also defined at (.*)\.$
^Declaration is \'.*\'\.$
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]bluetooth[/\\]rfcomm.rst):(?P<lineno>[0-9]+): WARNING: Duplicate C declaration, also defined at (.*)\.$
^Declaration is \'.*\'\.$
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]bluetooth[/\\]hci_raw.rst):(?P<lineno>[0-9]+): WARNING: Duplicate C declaration, also defined at (.*)\.$
^Declaration is \'.*\'\.$
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]bluetooth[/\\]hci_drivers.rst):(?P<lineno>[0-9]+): WARNING: Duplicate C declaration, also defined at (.*)\.$
^Declaration is \'.*\'\.$
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]bluetooth[/\\]gatt.rst):(?P<lineno>[0-9]+): WARNING: Duplicate C declaration, also defined at (.*)\.$
^Declaration is \'.*\'\.$
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]bluetooth[/\\]gap.rst):(?P<lineno>[0-9]+): WARNING: Duplicate C declaration, also defined at (.*)\.$
^Declaration is \'.*\'\.$


@@ -0,0 +1,14 @@
# multiple section 'index'
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]custom-doxygen[/\\]mainpage.md):[0-9]+: warning: multiple use of section label 'index' for main page, \(first occurrence: .*$
#
^(?P<filename>([\-:\\/\w\.])+[/\\]doc[/\\]reference[/\\]bluetooth[/\\]gatt.rst):(?P<lineno>[0-9]+): WARNING: Error in declarator$
^[ \t]*If.*:$
^[ \t]*Invalid C declaration: .* \[error at [0-9]+\]$
^[ \t]*ATOMIC_DEFINE.*$
^[- \t]*\^$
^[ \t]*If.*:$
^[ \t]*Error in declarator or parameters$
^[ \t]*Invalid C declaration: .* \[error at [0-9]+\]$
^[ \t]*ATOMIC_DEFINE.*$
^[- \t]*\^$
^[ \t]*$

.known-issues/make.conf

@@ -0,0 +1,6 @@
#
# When filtering output of the build process, ignore lines that don't
# provide any information that helps the invoker tell if there was an
# error.
#
^make: (Entering|Leaving) directory .*


@@ -0,0 +1,11 @@
#
# When executing test cases, ignore the following messages as they are
# not to be considered hard errors.
#
# Block line when test case cannot run in the HW due to server or connection issues
#
^BLCK0/[-a-z0-9:]+ (.+)#test @[^/]+/[^:]+:[^:]+: evaluation blocked(.*)$
#
# Block line when there is an issue with the YKUSH serial connection
#
^BLCK0/[-a-z0-9:]+ (.+)#test @[^/]+/(?P<board>[^:]+):[^:]+: exception: 400: (?P=board): Cannot find YKUSH serial '[A-Z0-9]+'$


@@ -0,0 +1,18 @@
#
# When executing test cases under TCF, ignore the following messages
# as they are not to be considered hard errors.
#
# TCF is run under make for taking advantage of the jobserver; when
# the testcase execution fail, make will complain, which we can
# ignore ('sommersault' was the old name of the target).
#
^/tmp/tcf-[a-zA-Z0-9]+.mk:[0-9]+: recipe for target ('tcf-jobserver-run'|'sommersault') failed$
#
# More of the same
#
^make: \*\*\* \[(tcf-jobserver-run|sommersault)\] Error 1$
#
# TCF's summary line. We don't need to consider it to determine if the
# run failed or passed.
#
^[A-Z]+0/\S+:\s+\S+\s+@\S+: [0-9]+ tests \([0-9]+ passed, [0-9]+ failed, [0-9]+ blocked, [0-9]+ skipped\).*$


@@ -0,0 +1,4 @@
#
# Skip line when test case is eliminated due to filters
#
^SKIP0/\S+\s+\S+: No targets can be used \(all [0-9]+ selected from [0-9]+ available eliminated by testcase filtering\)$

.mailmap

@@ -1,124 +1,32 @@
Alexandr Kolosov <rikorsev@gmail.com>
Alexandre d'Alton <alexandre.dalton@intel.com>
Amir Kaplan <amir.kaplan@intel.com>
Anas Nashif <anas.nashif@intel.com>
Andrzej Kuroś <andrzej.kuros@nordicsemi.no>
Anthony Smigielski <thebasti0ncode@gmail.com>
Armand Ciejak <armand@riedonetworks.com> <armandciejak@users.noreply.github.com>
Aska Wu <aska.wu@linaro.org>
Bit Pathe <bitpathe@gmail.com>
Bjarki Arge Andreasen <baa@trackunit.com>
Carles Cufi <carles.cufi@nordicsemi.no>
chao an <anchao@xiaomi.com>
Charles E. Youse <charles.youse@intel.com>
Chen Xingyu <hi@xingrz.me>
Christoph Schnetzler <christoph.schnetzler@husqvarnagroup.com>
Christoph Schramm <schramm@makaio.com>
Christopher Friedt <cfriedt@meta.com>
Christopher Friedt <cfriedt@meta.com> <cfriedt@fb.com>
Chuck Jordan <cjordan@synopsys.com>
Chunlin Han <chunlin.han@linaro.org> <chunlin.han@acer.com>
David B. Kinder <david.b.kinder@intel.com>
David Komel <a8961713@gmail.com>
David Leach <david.leach@nxp.com>
Dirk Brandewie <dirk.j.brandewie@intel.com>
Douglas Su <d0u9.su@outlook.com>
Enjia Mai <enjia.mai@intel.com>
Enjia Mai <enjia.mai@intel.com>
Evan Couzens <evanx.couzens@intel.com>
Evgeniy Paltsev <PaltsevEvgeniy@gmail.com>
Evgeniy Paltsev <PaltsevEvgeniy@gmail.com> <Eugeniy.Paltsev@synopsys.com>
Felipe Neves <ryukokki.felipe@gmail.com>
Findlay Feng <i@fengch.me>
Flavio Arieta Netto <flavio@exati.com.br>
Francois Ramu <francois.ramu@st.com>
Gerardo Aceves <gerardo.aceves@intel.com>
Gregory Shue <gregory.shue@legrand.com>
Gregory Shue <gregory.shue@legrand.com> <gregory.shue@legrand.us>
HaiLong Yang <cameledyang@pm.me>
James Johnson <james.johnson672@t-mobile.com>
Jarno Lämsä <jarno.lamsa@nordicsemi.no>
Jędrzej Ciupis <jedrzej.ciupis@nordicsemi.no>
Jeremie Garcia <jeremie.garcia@intel.com>
Jim Benjamin Luther <jilu@oticon.com>
Johan Kruger <johan.kruger@windriver.com>
Johann Fischer <j.fischer@phytec.de>
Jørgen Kvalvaag <jorgen.kvalvaag@nordicsemi.no>
Juan Manuel Cruz Alcaraz <juan.m.cruz.alcaraz@intel.com>
Juan Solano <juanx.solano.menacho@intel.com>
Julien D'Ascenzio <julien.dascenzio@paratronic.fr>
Jun Li <jun.r.li@intel.com>
Justin Watson <jwatson5@gmail.com>
Kamil Sroka <kamil.sroka@nordicsemi.no>
Katarzyna Giądła <katarzyna.giadla@nordicsemi.no>
Keren Siman-Tov <keren.siman-tov@intel.com>
Krzysztof Chruściński <krzysztof.chruscinski@nordicsemi.no>
Kuo-Lang Tseng <kuo-lang.tseng@intel.com>
Lei Liu <lei.a.liu@intel.com>
Leona Cook <leonax.cook@intel.com>
Dirk Brandewie <dirk.j.brandewie@intel.com> <dirk.j.brandewie@intel.com>
Mike Hirst <michael.hirst@windriver.com> <michael.hirst@windriver.com>
Johan Kruger <johan.kruger@windriver.com> <johan.kruger@windriver.com>
Leona Cook <leonax.cook@intel.com> <lsc@hackeress.com>
Lixin Guo <lixinx.guo@intel.com>
Łukasz Mazur <lukasz.mazur@hidglobal.com>
Manuel Argüelles <manuel.arguelles@nxp.com>
Manuel Argüelles <manuel.arguelles@nxp.com> <manuel.arguelles@coredumplabs.com>
Manuel Argüelles <manuel.arguelles@nxp.com> <marguelles.dev@gmail.com>
Marc Herbert <marc.herbert@intel.com> <46978960+marc-hb@users.noreply.github.com>
Marin Jurjević <marin.jurjevic@hotmail.com>
Mariusz Ryndzionek <mariusz.ryndzionek@firmwave.com>
Mariusz Skamra <mariusz.skamra@codecoup.pl>
Mariusz Skamra <mariusz.skamra@codecoup.pl> <mariusz.skamra@tieto.com>
Martí Bolívar <marti.bolivar@nordicsemi.no>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti.bolivar@linaro.org>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti.f.bolivar@gmail.com>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti@foundries.io>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti@opensourcefoundries.com>
Martin Jäger <martin@libre.solar> <17674105+martinjaeger@users.noreply.github.com>
Mateusz Hołenko <mholenko@antmicro.com>
Michael Rosen <michael.r.rosen@intel.com>
Michal Narajowski <michal.narajowski@codecoup.pl>
Mike Hirst <michael.hirst@windriver.com>
Ming Shao <ming.shao@intel.com>
Mohan Kumar Kumar <mohankm@fb.com>
Naga Raja Rao Tulasi <tulasi.r@tcs.com>
Navin Sankar Velliangiri <navin@linumiz.com>
NingX Zhao <ningx.zhao@intel.com>
Leona Cook <leonax.cook@intel.com> <leonax.cook@intel.com>
Thomas Heeley <thomas.heeley@intel.com> <thomas.heeley@intel.com>
Shuang He <shuang.he@intel.com> <shuang.he@intel.com>
Chuck Jordan <cjordan@synopsys.com> <cjordan@synopsys.com>
Jeremie Garcia <jeremie.garcia@intel.com> <jeremie.garcia@intel.com>
Bit Pathe <bitpathe@gmail.com> <bitpathe@gmail.com>
Carles Cufi <carles.cufi@nordicsemi.no> <carles.cufi@nordicsemi.no>
Kuo-Lang Tseng <kuo-lang.tseng@intel.com> <kuo-lang.tseng@intel.com>
Gerardo Aceves <gerardo.aceves@intel.com> <gerardo.aceves@intel.com>
Evan Couzens <evanx.couzens@intel.com> <evanx.couzens@intel.com>
Lei Liu <lei.a.liu@intel.com> <lei.a.liu@intel.com>
Douglas Su <d0u9.su@outlook.com> <d0u9.su@outlook.com>
Keren Siman-Tov <keren.siman-tov@intel.com> <keren.siman-tov@intel.com>
Naga Raja Rao Tulasi <tulasi.r@tcs.com> <tulasi.r@tcs.com>
Felipe Neves <ryukokki.felipe@gmail.com> <ryukokki.felipe@gmail.com>
Amir Kaplan <amir.kaplan@intel.com> <amir.kaplan@intel.com>
Anas Nashif <anas.nashif@intel.com> <anas.nashif@intel.com>
Ruud Derwig <Ruud.Derwig@synopsys.com> <Ruud.Derwig@synopsys.com>
Flavio Arieta Netto <flavio@exati.com.br>
Nishikant Nayak <nishikantax.nayak@intel.com>
Ole Sæther <ole.saether@nordicsemi.no>
Pavel Král <pavel.kral@omsquare.com>
Pavel Vasilyev <pavel.vasilyev@nordicsemi.no>
Paweł Czarnecki <pczarnecki@antmicro.com>
Paweł Czarnecki <pczarnecki@antmicro.com>
Paweł Czarnecki <pczarnecki@antmicro.com> <pczarnecki@internships.antmicro.com>
Paweł Kwiek <pawel.kwiek@nordicsemi.no>
Peng Chen <peng1.chen@intel.com>
Peter Bigot <peter.bigot@nordicsemi.no>
Peter Bigot <peter.bigot@nordicsemi.no> <pab@pabigot.com>
Peter Johanson <peter@peterjohanson.com>
Piyush Itankar <piyush.t.itankar@intel.com>
Radosław Koppel <radoslaw.koppel@nordicsemi.no>
Radosław Koppel <radoslaw.koppel@nordicsemi.no> <r.koppel@k-el.com>
Raja D. Singh <rdsingh@iotwizards.com>
Ricardo Salveti <ricardo@opensourcefoundries.com>
Ricardo Salveti <ricardo@opensourcefoundries.com> <ricardo.salveti@linaro.org>
Ruud Derwig <Ruud.Derwig@synopsys.com>
Saku Rautio <saku.rautio@nordicsemi.no>
Scott Worley <scott.worley@microchip.com>
Sean Nyekjaer <sean@geanix.com> <sean.nyekjaer@prevas.dk>
Sean Nyekjaer <sean@geanix.com> <sean@nyekjaer.dk>
Sharron Liu <sharron.liu@intel.com>
Shilpashree L C <shilpashree.lc@intel.com>
Shuang He <shuang.he@intel.com>
Sigvart Hovland <sigvart.hovland@nordicsemi.no>
Stéphane D'Alu <sdalu@sdalu.com>
Stine Åkredalen <stine.akredalen@nordicsemi.no>
Thomas Heeley <thomas.heeley@intel.com>
Tim Sørensen <tims@demant.com>
Tim Sørensen <tims@demant.com> <tims@oticon.com>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no> <vich@nordicsemi.no>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no> <vinayak.kariappa@gmail.com>
Justin Watson <jwatson5@gmail.com>
Johann Fischer <j.fischer@phytec.de>
Jun Li <jun.r.li@intel.com>
Xiaorui Hu <xiaorui.hu@linaro.org>
Yannis Damigos <giannis.damigos@gmail.com> <ydamigos@iccs.gr>
Yonattan Louise <yonattan.a.louise.mendoza@intel.com>
Yonattan Louise <yonattan.a.louise.mendoza@intel.com> <yonattan.a.louise.mendoza@linux.intel.com>
YouhuaX Zhu <youhuax.zhu@intel.com>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no> <vinayak.kariappa.chettimada@nordicsemi.no> <vich@nordicsemi.no> <vinayak.kariappa@gmail.com>
Sean Nyekjaer <sean@geanix.com> <sean.nyekjaer@prevas.dk>
Sean Nyekjaer <sean@geanix.com> <sean@nyekjaer.dk>

File diff suppressed because it is too large.


@@ -1,30 +0,0 @@
# Copyright (c) 2024 Basalte bv
# SPDX-License-Identifier: Apache-2.0
extend = ".ruff-excludes.toml"
line-length = 100
target-version = "py310"
[lint]
select = [
# zephyr-keep-sorted-start
"B", # flake8-bugbear
"E", # pycodestyle
"F", # pyflakes
"I", # isort
"SIM", # flake8-simplify
"UP", # pyupgrade
"W", # pycodestyle warnings
# zephyr-keep-sorted-stop
]
ignore = [
# zephyr-keep-sorted-start
"SIM108", # Allow if-else blocks instead of forcing ternary operator
# zephyr-keep-sorted-stop
]
[format]
quote-style = "preserve"
line-ending = "lf"

.uncrustify.cfg

@@ -0,0 +1,80 @@
indent_with_tabs = 2 # 1=indent to level only, 2=indent with tabs
input_tab_size = 8 # original tab size
output_tab_size = 8 # new tab size
indent_columns = output_tab_size
indent_label = 1 # pos: absolute col, neg: relative column
indent_switch_case = 0 # number
#
# inter-symbol newlines
#
nl_enum_brace = remove # "enum {" vs "enum \n {"
nl_union_brace = remove # "union {" vs "union \n {"
nl_struct_brace = remove # "struct {" vs "struct \n {"
nl_do_brace = remove # "do {" vs "do \n {"
nl_if_brace = remove # "if () {" vs "if () \n {"
nl_for_brace = remove # "for () {" vs "for () \n {"
nl_else_brace = remove # "else {" vs "else \n {"
nl_while_brace = remove # "while () {" vs "while () \n {"
nl_switch_brace = remove # "switch () {" vs "switch () \n {"
nl_brace_while = remove # "} while" vs "} \n while" - cuddle while
nl_brace_else = remove # "} \n else" vs "} else"
nl_func_var_def_blk = 1
nl_fcall_brace = remove # "list_for_each() {" vs "list_for_each()\n{"
nl_fdef_brace = add # "int foo() {" vs "int foo()\n{"
#
# Source code modifications
#
mod_paren_on_return = ignore # "return 1;" vs "return (1);"
mod_full_brace_if = add # "if() { } else { }" vs "if() else"
#
# inter-character spacing options
#
sp_sizeof_paren = remove # "sizeof (int)" vs "sizeof(int)"
sp_before_sparen = force # "if (" vs "if("
sp_after_sparen = force # "if () {" vs "if (){"
sp_inside_braces = add # "{ 1 }" vs "{1}"
sp_inside_braces_struct = add # "{ 1 }" vs "{1}"
sp_inside_braces_enum = add # "{ 1 }" vs "{1}"
sp_assign = add
sp_arith = add
sp_bool = add
sp_compare = add
sp_assign = add
sp_after_comma = add
sp_func_def_paren = remove # "int foo (){" vs "int foo(){"
sp_func_call_paren = remove # "foo (" vs "foo("
sp_func_proto_paren = remove # "int foo ();" vs "int foo();"
sp_inside_fparen = remove # "func( arg )" vs "func(arg)"
sp_else_brace = add # ignore/add/remove/force
sp_before_ptr_star = add # ignore/add/remove/force
sp_after_ptr_star = remove # ignore/add/remove/force
sp_between_ptr_star = remove # ignore/add/remove/force
sp_inside_paren = remove # remove spaces inside parens
sp_paren_paren = remove # remove spaces between nested parens
sp_inside_sparen = remove # remove spaces inside parens for if, while and the like
sp_brace_else = add # ignore/add/remove/force
sp_before_nl_cont = ignore
sp_cmt_cpp_start = add
sp_brace_typedef = add # }typedefd_name -> } typedefd_name
cmt_sp_after_star_cont = 1
#
# Aligning stuff
#
align_with_tabs = FALSE # use tabs to align
align_on_tabstop = TRUE # align on tabstops
align_enum_equ_span = 4 # '=' in enum definition
align_struct_init_span = 0 # align stuff in a structure init '= { }'
align_right_cmt_span = 3
align_nl_cont = TRUE
sp_pp_concat = ignore # ignore/add/remove/force


@@ -1,16 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
extends: default
rules:
line-length:
max: 100
comments:
min-spaces-from-content: 1
indentation:
spaces: 2
indent-sequences: consistent
document-start:
present: false
truthy:
check-keys: false

File diff suppressed because it is too large.


@@ -1,4 +1,628 @@
# Instead of the CODEOWNERS file, The Zephyr Project uses a custom format.
# See MAINTAINERS.yml for details.
#
# DO NOT EDIT THIS FILE.
# CODEOWNERS for autoreview assigning in github
# https://help.github.com/en/articles/about-code-owners#codeowners-syntax
# Order is important; for each modified file, the last matching
# pattern takes the most precedence.
# That is, with the last pattern being
# *.rst @nashif
# if only .rst files are being modified, only nashif is
# automatically requested for review, but you can manually
# add others as needed.
# Do not use wildcard on all source yet
# * @galak @nashif
/.known-issues/ @nashif
/.github/ @nashif
/.github/workflows/ @galak @nashif
/.buildkite/ @galak
/MAINTAINERS.yml @ioannisg @MaureenHelm
/arch/arc/ @abrodkin @ruuddw
/arch/arm/ @MaureenHelm @galak @ioannisg
/arch/arm/core/aarch32/cortex_m/cmse/ @ioannisg
/arch/arm/core/aarch64/ @carlocaione
/arch/arm/include/aarch32/cortex_m/cmse.h @ioannisg
/arch/arm/include/aarch64/ @carlocaione
/arch/arm/core/aarch32/cortex_a_r/ @MaureenHelm @galak @ioannisg @bbolen @stephanosio
/arch/common/ @ioannisg @andyross
/soc/arc/snps_*/ @abrodkin @ruuddw
/soc/nios2/ @nashif
/soc/arm/ @MaureenHelm @galak @ioannisg
/soc/arm/arm/mps2/ @fvincenzo
/soc/arm/atmel_sam/common/*_sam4l_*.c @nandojve
/soc/arm/atmel_sam/sam3x/ @ioannisg
/soc/arm/atmel_sam/sam4e/ @nandojve
/soc/arm/atmel_sam/sam4l/ @nandojve
/soc/arm/atmel_sam/sam4s/ @fallrisk
/soc/arm/atmel_sam/same70/ @nandojve
/soc/arm/atmel_sam/samv71/ @nandojve
/soc/arm/cypress/ @nandojve
/soc/arm/bcm*/ @sbranden
/soc/arm/infineon_xmc/ @parthitce
/soc/arm/nxp*/ @MaureenHelm
/soc/arm/nordic_nrf/ @ioannisg
/soc/arm/nuvoton/ @ssekar15
/soc/arm/nuvoton_npcx/ @MulinChao
/soc/arm/qemu_cortex_a53/ @carlocaione
/soc/arm/quicklogic_eos_s3/ @kowalewskijan @kgugala
/soc/arm/silabs_exx32/efm32pg1b/ @rdmeneze
/soc/arm/silabs_exx32/efr32mg21/ @l-alfred
/soc/arm/st_stm32/ @erwango
/soc/arm/st_stm32/*/power.c @FRASTM
/soc/arm/st_stm32/stm32mp1/ @arnopo
/soc/arm/ti_simplelink/cc13x2_cc26x2/ @bwitherspoon
/soc/arm/ti_simplelink/cc32xx/ @vanti
/soc/arm/ti_simplelink/msp432p4xx/ @Mani-Sadhasivam
/soc/arm/xilinx_zynqmp/ @stephanosio
/soc/xtensa/intel_s1000/ @sathishkuttan @dcpleung
/arch/x86/ @jhedberg @nashif @jenmwms @aasthagr
/arch/nios2/ @nashif
/arch/posix/ @aescolar @daor-oti
/arch/riscv/ @kgugala @pgielda
/soc/posix/ @aescolar @daor-oti
/soc/riscv/ @kgugala @pgielda
/soc/riscv/openisa*/ @MaureenHelm
/soc/x86/ @dcpleung @nashif @jenmwms @aasthagr
/arch/xtensa/ @dcpleung @andyross @nashif
/soc/xtensa/ @dcpleung @andyross @nashif
/arch/sparc/ @martin-aberg
/soc/sparc/ @martin-aberg
/boards/arc/ @abrodkin @ruuddw
/boards/arm/ @MaureenHelm @galak
/boards/arm/96b_argonkey/ @avisconti
/boards/arm/96b_avenger96/ @Mani-Sadhasivam
/boards/arm/96b_carbon/ @idlethread
/boards/arm/96b_meerkat96/ @Mani-Sadhasivam
/boards/arm/96b_nitrogen/ @idlethread
/boards/arm/96b_neonkey/ @Mani-Sadhasivam
/boards/arm/96b_stm32_sensor_mez/ @Mani-Sadhasivam
/boards/arm/96b_wistrio/ @Mani-Sadhasivam
/boards/arm/arduino_due/ @ioannisg
/boards/arm/cc1352r1_launchxl/ @bwitherspoon
/boards/arm/cc26x2r1_launchxl/ @bwitherspoon
/boards/arm/cc3220sf_launchxl/ @vanti
/boards/arm/cy8* @nandojve
/boards/arm/disco_l475_iot1/ @erwango
/boards/arm/efm32pg_stk3401a/ @rdmeneze
/boards/arm/faze/ @mbittan @simonguinot
/boards/arm/frdm*/ @MaureenHelm
/boards/arm/frdm*/doc/ @MaureenHelm @MeganHansen
/boards/arm/google_*/ @jackrosenthal
/boards/arm/hexiwear*/ @MaureenHelm
/boards/arm/hexiwear*/doc/ @MaureenHelm @MeganHansen
/boards/arm/ip_k66f/ @parthitce
/boards/arm/lpcxpresso*/ @MaureenHelm
/boards/arm/lpcxpresso*/doc/ @MaureenHelm @MeganHansen
/boards/arm/mimx8mm_evk/ @Mani-Sadhasivam
/boards/arm/mimxrt*/ @MaureenHelm
/boards/arm/mimxrt*/doc/ @MaureenHelm @MeganHansen
/boards/arm/mps2_an385/ @fvincenzo
/boards/arm/msp_exp432p401r_launchxl/ @Mani-Sadhasivam
/boards/arm/npcx7m6fb_evb/ @MulinChao
/boards/arm/nrf*/ @carlescufi @lemrey @ioannisg
/boards/arm/nucleo*/ @erwango @ABOSTM @FRASTM
/boards/arm/nucleo_f401re/ @idlethread
/boards/arm/nuvoton_pfm_m487/ @ssekar15
/boards/arm/qemu_cortex_a53/ @carlocaione
/boards/arm/qemu_cortex_r*/ @stephanosio
/boards/arm/qemu_cortex_m*/ @ioannisg
/boards/arm/quick_feather/ @kowalewskijan @kgugala
/boards/arm/rak5010_nrf52840/ @gpaquet85
/boards/arm/xmc45_relax_kit/ @parthitce
/boards/arm/sam4e_xpro/ @nandojve
/boards/arm/sam4l_ek/ @nandojve
/boards/arm/sam4s_xplained/ @fallrisk
/boards/arm/sam_e70_xplained/ @nandojve
/boards/arm/sam_v71_xult/ @nandojve
/boards/arm/v2m_beetle/ @fvincenzo
/boards/arm/olimexino_stm32/ @ydamigos
/boards/arm/sensortile_box/ @avisconti
/boards/arm/steval_fcu001v1/ @Navin-Sankar
/boards/arm/stm32l1_disco/ @karlp
/boards/arm/stm32*_disco/ @erwango @ABOSTM @FRASTM
/boards/arm/stm32f3_disco/ @ydamigos
/boards/arm/stm32*_eval/ @erwango @ABOSTM @FRASTM
/boards/common/ @mbolivar-nordic
/boards/deprecated.cmake @tejlmand
/boards/nios2/ @nashif
/boards/nios2/altera_max10/ @nashif
/boards/arm/stm32_min_dev/ @cbsiddharth
/boards/posix/ @aescolar @daor-oti
/boards/posix/nrf52_bsim/ @aescolar @wopu-ot
/boards/riscv/ @kgugala @pgielda
/boards/riscv/rv32m1_vega/ @MaureenHelm
/boards/shields/ @erwango
/boards/shields/atmel_rf2xx/ @nandojve
/boards/shields/esp_8266/ @nandojve
/boards/shields/inventek_eswifi/ @nandojve
/boards/x86/ @dcpleung @nashif @jenmwms @aasthagr
/boards/xtensa/ @nashif @dcpleung
/boards/xtensa/intel_s1000_crb/ @sathishkuttan @dcpleung
/boards/xtensa/odroid_go/ @ydamigos
/boards/sparc/ @martin-aberg
# All cmake related files
/cmake/ @tejlmand @nashif
/cmake/*/arcmwdt/ @abrodkin @evgeniy-paltsev @tejlmand
/CMakeLists.txt @tejlmand @nashif
/doc/ @dbkinder
/doc/guides/coccinelle.rst @himanshujha199640 @JuliaLawall
/doc/CMakeLists.txt @carlescufi
/doc/scripts/ @carlescufi
/doc/guides/bluetooth/ @joerchan @jhedberg @Vudentz
/doc/guides/dts/ @galak @mbolivar-nordic
/doc/reference/bluetooth/ @joerchan @jhedberg @Vudentz
/doc/reference/devicetree/ @galak @mbolivar-nordic
/doc/reference/resource_management/ @pabigot
/doc/reference/networking/can* @alexanderwachter
/doc/security/ @ceolin @d3zd3z
/drivers/debug/ @nashif
/drivers/*/*sam4l* @nandojve
/drivers/*/*cc13xx_cc26xx* @bwitherspoon
/drivers/*/*litex* @mateusz-holenko @kgugala @pgielda
/drivers/*/*mcux* @MaureenHelm
/drivers/*/*stm32* @erwango @ABOSTM @FRASTM
/drivers/*/*native_posix* @aescolar @daor-oti
/drivers/*/*lpc11u6x* @mbittan @simonguinot
/drivers/*/*npcx* @MulinChao
/drivers/adc/ @anangl
/drivers/adc/adc_stm32.c @cybertale
/drivers/bluetooth/ @joerchan @jhedberg @Vudentz
/drivers/can/ @alexanderwachter
/drivers/can/*mcp2515* @karstenkoenig
/drivers/clock_control/*nrf* @nordic-krch
/drivers/clock_control/*esp32* @extremegtx
/drivers/counter/ @nordic-krch
/drivers/console/ipm_console.c @finikorg
/drivers/console/semihost_console.c @luozhongyao
/drivers/counter/counter_cmos.c @dcpleung
/drivers/counter/maxim_ds3231.c @pabigot
/drivers/crypto/*nrf_ecb* @maciekfabia @anangl
/drivers/console/*mux* @jukkar
/drivers/display/ @vanwinkeljan
/drivers/display/display_framebuf.c @dcpleung
/drivers/dac/ @martinjaeger
/drivers/dma/*dw* @tbursztyka
/drivers/dma/*sam0* @Sizurka
/drivers/dma/dma_stm32* @cybertale @lowlander
/drivers/dma/*pl330* @raveenp
/drivers/dma/*iproc_pax* @raveenp
/drivers/ec_host_cmd_periph/ @jettr
/drivers/edac/ @finikorg
/drivers/eeprom/ @henrikbrixandersen
/drivers/eeprom/eeprom_stm32.c @KwonTae-young
/drivers/entropy/*rv32m1* @MaureenHelm
/drivers/entropy/*gecko* @chrta
/drivers/entropy/*litex* @mateusz-holenko @kgugala @pgielda
/drivers/espi/ @albertofloyd @franciscomunoz @scottwcpg
/drivers/ethernet/ @jukkar @tbursztyka @pfalcon
/drivers/ethernet/*stm32* @Nukersson @lochej
/drivers/ethernet/*w5500* @parthitce
/drivers/flash/ @nashif @nvlsianpu
/drivers/flash/*nrf* @nvlsianpu
/drivers/flash/*spi_nor* @pabigot
/drivers/gpio/ @mnkp @pabigot
/drivers/gpio/*ht16k33* @henrikbrixandersen
/drivers/gpio/*lmp90xxx* @henrikbrixandersen
/drivers/gpio/*stm32* @erwango
/drivers/gpio/*sx1509b* @pabigot
/drivers/hwinfo/ @alexanderwachter
/drivers/i2s/i2s_ll_stm32* @avisconti
/drivers/i2c/i2c_common.c @sjg20
/drivers/i2c/i2c_emul.c @sjg20
/drivers/i2c/i2c_ite_it8xxx2.c @GTLin08
/drivers/i2c/i2c_shell.c @nashif
/drivers/i2c/Kconfig.i2c_emul @sjg20
/drivers/i2c/Kconfig.it8xxx2 @GTLin08
/drivers/i2c/slave/*eeprom* @henrikbrixandersen
/drivers/i2s/*litex* @mateusz-holenko @kgugala @pgielda
/drivers/ieee802154/ @jukkar @tbursztyka
/drivers/ieee802154/ieee802154_rf2xx* @jukkar @tbursztyka @nandojve
/drivers/ieee802154/ieee802154_cc13xx* @bwitherspoon @cfriedt
/drivers/interrupt_controller/ @dcpleung @nashif
/drivers/interrupt_controller/intc_gic.c @stephanosio
/drivers/ipm/ipm_mhu* @karl-zh
/drivers/ipm/Kconfig.nrfx @masz-nordic @ioannisg
/drivers/ipm/Kconfig.nrfx_ipc_channel @masz-nordic @ioannisg
/drivers/ipm/ipm_intel_adsp.c @finikorg
/drivers/ipm/ipm_cavs_idc* @dcpleung
/drivers/ipm/ipm_nrfx_ipc.c @masz-nordic @ioannisg
/drivers/ipm/ipm_nrfx_ipc.h @masz-nordic @ioannisg
/drivers/ipm/ipm_stm32_ipcc.c @arnopo
/drivers/kscan/ @albertofloyd @franciscomunoz @scottwcpg
/drivers/led/ @Mani-Sadhasivam
/drivers/led_strip/ @mbolivar-nordic
/drivers/lora/ @Mani-Sadhasivam
/drivers/memc/ @gmarull
/drivers/modem/*gsm* @jukkar
/drivers/modem/hl7800.c @rerickson1
/drivers/modem/Kconfig.hl7800 @rerickson1
/drivers/pcie/ @dcpleung @nashif @jhedberg
/drivers/peci/ @albertofloyd @franciscomunoz @scottwcpg
/drivers/pinmux/*hsdk* @iriszzw
/drivers/pinmux/*it8xxx2* @ite
/drivers/psci/ @carlocaione
/drivers/ps2/ @albertofloyd @franciscomunoz @scottwcpg
/drivers/pwm/*rv32m1* @henrikbrixandersen
/drivers/pwm/*sam0* @nzmichaelh
/drivers/pwm/*stm32* @gmarull
/drivers/pwm/*xlnx* @henrikbrixandersen
/drivers/pwm/pwm_capture.c @henrikbrixandersen
/drivers/pwm/pwm_shell.c @henrikbrixandersen
/drivers/regulator/ @pabigot
/drivers/sensor/ @MaureenHelm
/drivers/sensor/ams_iAQcore/ @alexanderwachter
/drivers/sensor/ens210/ @alexanderwachter
/drivers/sensor/hts*/ @avisconti
/drivers/sensor/lis*/ @avisconti
/drivers/sensor/lps*/ @avisconti
/drivers/sensor/lsm*/ @avisconti
/drivers/sensor/mpr/ @sven-hm
/drivers/sensor/st*/ @avisconti
/drivers/serial/uart_altera_jtag_hal.c @nashif
/drivers/serial/*ns16550* @dcpleung @nashif @jenmwms @aasthagr
/drivers/serial/*nrfx* @Mierunski @anangl
/drivers/serial/uart_liteuart.c @mateusz-holenko @kgugala @pgielda
/drivers/serial/Kconfig.mcux_iuart @Mani-Sadhasivam
/drivers/serial/uart_mcux_iuart.c @Mani-Sadhasivam
/drivers/serial/Kconfig.rtt @carlescufi @pkral78
/drivers/serial/uart_rtt.c @carlescufi @pkral78
/drivers/serial/Kconfig.xlnx @wjliang
/drivers/serial/uart_xlnx_ps.c @wjliang
/drivers/serial/uart_xlnx_uartlite.c @henrikbrixandersen
/drivers/serial/*xmc4xxx* @parthitce
/drivers/serial/*nuvoton* @ssekar15
/drivers/serial/*apbuart* @martin-aberg
/drivers/net/ @jukkar @tbursztyka
/drivers/ptp_clock/ @jukkar
/drivers/spi/ @tbursztyka
/drivers/spi/spi_rv32m1_lpspi* @karstenkoenig
/drivers/timer/apic_timer.c @dcpleung @nashif
/drivers/timer/arm_arch_timer.c @carlocaione
/drivers/timer/cortex_m_systick.c @ioannisg
/drivers/timer/altera_avalon_timer_hal.c @nashif
/drivers/timer/riscv_machine_timer.c @kgugala @pgielda
/drivers/timer/ite_it8xxx2_timer.c @ite
/drivers/timer/xlnx_psttc_timer* @wjliang @stephanosio
/drivers/timer/cc13x2_cc26x2_rtc_timer.c @vanti
/drivers/timer/cavs_timer.c @dcpleung
/drivers/timer/stm32_lptim_timer.c @FRASTM
/drivers/timer/leon_gptimer.c @martin-aberg
/drivers/usb/ @jfischer-no @finikorg
/drivers/usb/device/usb_dc_stm32.c @ydamigos @loicpoulain
/drivers/video/ @loicpoulain
/drivers/i2c/i2c_ll_stm32* @ydamigos
/drivers/i2c/i2c_rv32m1_lpi2c* @henrikbrixandersen
/drivers/i2c/*sam0* @Sizurka
/drivers/i2c/i2c_dw* @dcpleung
/drivers/*/*xec* @franciscomunoz @albertofloyd @scottwcpg
/drivers/watchdog/*gecko* @oanerer
/drivers/watchdog/*sifive* @katsuster
/drivers/watchdog/wdt_handlers.c @dcpleung @nashif
/drivers/wifi/ @jukkar @tbursztyka @pfalcon
/drivers/wifi/esp/ @mniestroj
/drivers/wifi/eswifi/ @loicpoulain @nandojve
/drivers/wifi/winc1500/ @kludentwo
/drivers/virtualization/ @tbursztyka
/dts/arc/ @abrodkin @ruuddw @iriszzw
/dts/arm/atmel/sam4e* @nandojve
/dts/arm/atmel/sam4l* @nandojve
/dts/arm/atmel/samr21.dtsi @benpicco
/dts/arm/atmel/sam*5*.dtsi @benpicco
/dts/arm/atmel/same70* @nandojve
/dts/arm/atmel/samv71* @nandojve
/dts/arm/atmel/ @galak
/dts/arm/broadcom/ @sbranden
/dts/arm/infineon/ @parthitce
/dts/arm/qemu-virt/ @carlocaione
/dts/arm/quicklogic/ @wtatarski @kowalewskijan @kgugala
/dts/arm/st/ @erwango
/dts/arm/ti/cc13?2* @bwitherspoon
/dts/arm/ti/cc26?2* @bwitherspoon
/dts/arm/ti/cc3235* @vanti
/dts/arm/nordic/ @ioannisg @carlescufi
/dts/arm/nuvoton/ @ssekar15
/dts/arm/nuvoton/npcx/ @MulinChao
/dts/arm/nxp/ @MaureenHelm
/dts/arm/microchip/ @franciscomunoz @albertofloyd @scottwcpg
/dts/arm/silabs/efm32_pg_1b.dtsi @rdmeneze
/dts/arm/silabs/efm32gg11b* @oanerer
/dts/arm/silabs/efm32_jg_pg* @chrta
/dts/arm/silabs/efr32bg13p* @mnkp
/dts/arm/silabs/efm32jg12b* @chrta
/dts/arm/silabs/efm32pg12b* @chrta
/dts/arm/silabs/efm32pg1b* @rdmeneze
/dts/arm/silabs/efr32mg21* @l-alfred
/dts/riscv/ @kgugala @pgielda
/dts/riscv/it8xxx2.dtsi @ite
/dts/riscv/microsemi-miv.dtsi @galak
/dts/riscv/rv32m1* @MaureenHelm
/dts/riscv/riscv32-litex-vexriscv.dtsi @mateusz-holenko @kgugala @pgielda
/dts/arm/armv7-r.dtsi @bbolen @stephanosio
/dts/arm/armv8-a.dtsi @carlocaione
/dts/arm/xilinx/ @bbolen @stephanosio
/dts/x86/ @jhedberg
/dts/xtensa/xtensa.dtsi @ydamigos
/dts/xtensa/intel/ @dcpleung
/dts/sparc/ @martin-aberg
/dts/bindings/ @galak
/dts/bindings/can/ @alexanderwachter
/dts/bindings/i2c/zephyr*i2c-emul.yaml @sjg20
/dts/bindings/adc/st*stm32-adc.yaml @cybertale
/dts/bindings/modem/*hl7800.yaml @rerickson1
/dts/bindings/serial/ns16550.yaml @dcpleung @nashif
/dts/bindings/wifi/*esp.yaml @mniestroj
/dts/bindings/*/*npcx* @MulinChao
/dts/bindings/*/nordic* @anangl
/dts/bindings/*/nxp* @MaureenHelm
/dts/bindings/*/openisa* @MaureenHelm
/dts/bindings/*/st* @erwango
/dts/bindings/sensor/ams* @alexanderwachter
/dts/bindings/*/sifive* @mateusz-holenko @kgugala @pgielda
/dts/bindings/*/litex* @mateusz-holenko @kgugala @pgielda
/dts/bindings/*/vexriscv* @mateusz-holenko @kgugala @pgielda
/dts/bindings/psci/* @carlocaione
/dts/posix/ @aescolar @vanwinkeljan @daor-oti
/dts/bindings/sensor/*bme680* @BoschSensortec
/dts/bindings/sensor/st* @avisconti
/include/ @nashif @carlescufi @galak @MaureenHelm
/include/drivers/*/*litex* @mateusz-holenko @kgugala @pgielda
/include/drivers/adc.h @anangl
/include/drivers/can.h @alexanderwachter
/include/drivers/counter.h @nordic-krch
/include/drivers/dac.h @martinjaeger
/include/drivers/display.h @vanwinkeljan
/include/drivers/espi.h @albertofloyd @franciscomunoz @scottwcpg
/include/drivers/bluetooth/ @joerchan @jhedberg @Vudentz
/include/drivers/flash.h @nashif @carlescufi @galak @MaureenHelm @nvlsianpu
/include/drivers/i2c_emul.h @sjg20
/include/drivers/led/ht16k33.h @henrikbrixandersen
/include/drivers/interrupt_controller/ @dcpleung @nashif
/include/drivers/interrupt_controller/gic.h @stephanosio
/include/drivers/modem/hl7800.h @rerickson1
/include/drivers/pcie/ @dcpleung
/include/drivers/hwinfo.h @alexanderwachter
/include/drivers/led.h @Mani-Sadhasivam
/include/drivers/led_strip.h @mbolivar-nordic
/include/drivers/sensor.h @MaureenHelm
/include/drivers/spi.h @tbursztyka
/include/drivers/lora.h @Mani-Sadhasivam
/include/drivers/peci.h @albertofloyd @franciscomunoz @scottwcpg
/include/drivers/psci.h @carlocaione
/include/app_memory/ @dcpleung
/include/arch/arc/ @abrodkin @ruuddw
/include/arch/arc/arch.h @abrodkin @ruuddw
/include/arch/arc/v2/irq.h @abrodkin @ruuddw
/include/arch/arm/aarch32/ @MaureenHelm @galak @ioannisg
/include/arch/arm/aarch32/cortex_a_r/ @stephanosio
/include/arch/arm/aarch64/ @carlocaione
/include/arch/arm/arm-smccc.h @carlocaione
/include/arch/arm/aarch32/irq.h @carlocaione
/include/arch/nios2/ @nashif
/include/arch/nios2/arch.h @nashif
/include/arch/posix/ @aescolar @daor-oti
/include/arch/riscv/ @kgugala @pgielda
/include/arch/x86/ @jhedberg @dcpleung
/include/arch/common/ @andyross @nashif
/include/arch/xtensa/ @andyross @dcpleung
/include/arch/sparc/ @martin-aberg
/include/sys/atomic.h @andyross
/include/bluetooth/ @joerchan @jhedberg @Vudentz
/include/cache.h @carlocaione @andyross
/include/canbus/ @alexanderwachter
/include/tracing/ @nashif
/include/debug/ @nashif
/include/debug/coredump.h @dcpleung
/include/debug/gdbstub.h @ceolin
/include/device.h @tbursztyka @nashif
/include/devicetree.h @galak
/include/display/ @vanwinkeljan
/include/dt-bindings/clock/kinetis_mcg.h @henrikbrixandersen
/include/dt-bindings/clock/kinetis_scg.h @henrikbrixandersen
/include/dt-bindings/dma/stm32_dma.h @cybertale
/include/dt-bindings/pcie/ @dcpleung
/include/dt-bindings/usb/usb.h @galak @finikorg
/include/emul.h @sjg20
/include/fs/ @nashif @nvlsianpu @de-nordic @pabigot
/include/init.h @nashif @andyross
/include/irq.h @dcpleung @nashif @andyross
/include/irq_offload.h @dcpleung @nashif @andyross
/include/kernel.h @dcpleung @nashif @andyross
/include/kernel_version.h @dcpleung @nashif @andyross
/include/linker/app_smem*.ld @dcpleung @nashif
/include/linker/ @dcpleung @nashif @andyross
/include/logging/ @nordic-krch
/include/lorawan/lorawan.h @Mani-Sadhasivam
/include/mgmt/osdp.h @cbsiddharth
/include/net/ @jukkar @tbursztyka @pfalcon
/include/net/buf.h @jukkar @jhedberg @tbursztyka @pfalcon
/include/net/coap*.h @jukkar @rlubos
/include/net/lwm2m*.h @jukkar @rlubos
/include/net/mqtt.h @jukkar @rlubos
/include/posix/ @pfalcon
/include/power/power.h @pabigot @nashif @ceolin
/include/ptp_clock.h @jukkar
/include/shared_irq.h @dcpleung @nashif @andyross
/include/shell/ @jakub-uC @nordic-krch
/include/sw_isr_table.h @dcpleung @nashif @andyross
/include/sys_clock.h @dcpleung @nashif @andyross
/include/sys/sys_io.h @dcpleung @nashif @andyross
/include/sys/kobject.h @dcpleung @nashif
/include/toolchain.h @dcpleung @andyross @nashif
/include/toolchain/ @dcpleung @nashif @andyross
/include/zephyr.h @dcpleung @nashif @andyross
/kernel/ @dcpleung @nashif @andyross
/lib/fnmatch/ @carlescufi
/lib/gui/ @vanwinkeljan
/lib/open-amp/ @arnopo
/lib/os/ @dcpleung @nashif @andyross
/lib/posix/ @pfalcon
/lib/cmsis_rtos_v2/ @nashif
/lib/cmsis_rtos_v1/ @nashif
/lib/libc/ @nashif
/modules/ @nashif
/modules/Kconfig.tfm @ioannisg @microbuilder
/kernel/device.c @andyross @nashif
/kernel/idle.c @andyross @nashif
/samples/ @nashif
/samples/basic/minimal/ @carlescufi
/samples/basic/servo_motor/boards/*microbit* @jhe
/samples/bluetooth/ @jhedberg @Vudentz @joerchan
/samples/boards/intel_s1000_crb/ @sathishkuttan @dcpleung @nashif
/samples/display/ @vanwinkeljan
/samples/drivers/can/ @alexanderwachter
/samples/drivers/clock_control_litex/ @mateusz-holenko @kgugala @pgielda
/samples/drivers/display/ @vanwinkeljan
/samples/drivers/ht16k33/ @henrikbrixandersen
/samples/drivers/lora/ @Mani-Sadhasivam
/samples/drivers/counter/maxim_ds3231/ @pabigot
/samples/lorawan/ @Mani-Sadhasivam
/samples/net/ @jukkar @tbursztyka @pfalcon
/samples/net/cloud/tagoio_http_post/ @nandojve
/samples/net/dns_resolve/ @jukkar @tbursztyka @pfalcon
/samples/net/lwm2m_client/ @rlubos
/samples/net/mqtt_publisher/ @jukkar @tbursztyka
/samples/net/sockets/coap_*/ @rlubos
/samples/net/sockets/ @jukkar @tbursztyka @pfalcon
/samples/net/*civetweb* @Nukersson
/samples/sensor/ @MaureenHelm
/samples/shields/ @avisconti
/samples/subsys/logging/ @nordic-krch @jakub-uC
/samples/subsys/shell/ @jakub-uC @nordic-krch
/samples/subsys/mgmt/mcumgr/smp_svr/ @aunsbjerg @nvlsianpu
/samples/subsys/mgmt/updatehub/ @nandojve @otavio
/samples/subsys/mgmt/osdp/ @cbsiddharth
/samples/subsys/usb/ @jfischer-no @finikorg
/samples/subsys/power/ @nashif @pabigot @ceolin
/samples/tfm_integration/ @ioannisg @microbuilder
/samples/userspace/ @dcpleung @nashif
/scripts/coccicheck @himanshujha199640 @JuliaLawall
/scripts/coccinelle/ @himanshujha199640 @JuliaLawall
/scripts/coredump/ @dcpleung
/scripts/kconfig/ @ulfalizer
/scripts/pylib/twister/expr_parser.py @nashif
/scripts/schemas/twister/ @nashif
/scripts/gen_app_partitions.py @dcpleung @nashif
/scripts/get_maintainer.py @nashif
/scripts/dts/ @mbolivar-nordic @galak
/scripts/release/ @nashif
/scripts/ci/ @nashif
/arch/x86/gen_gdt.py @dcpleung @nashif
/arch/x86/gen_idt.py @dcpleung @nashif
/scripts/gen_kobject_list.py @dcpleung @nashif
/scripts/gen_syscalls.py @dcpleung @nashif
/scripts/list_boards.py @mbolivar-nordic
/scripts/net/ @jukkar
/scripts/process_gperf.py @dcpleung @nashif
/scripts/gen_relocate_app.py @dcpleung
/scripts/requirements*.txt @mbolivar-nordic @galak @nashif
/scripts/tests/twister/ @aasthagr
/scripts/tests/build/test_subfolder_list.py @rmstoi
/scripts/tracing/ @nashif
/scripts/pylib/twister/ @nashif
/scripts/twister @nashif
/scripts/series-push-hook.sh @erwango
/scripts/west_commands/ @mbolivar-nordic
/scripts/west-commands.yml @mbolivar-nordic
/scripts/zephyr_module.py @tejlmand
/scripts/user_wordsize.py @cfriedt
/scripts/valgrind.supp @aescolar @daor-oti
/share/zephyr-package/ @tejlmand
/share/zephyrunittest-package/ @tejlmand
/subsys/bluetooth/ @joerchan @jhedberg @Vudentz
/subsys/bluetooth/controller/ @carlescufi @cvinayak @thoh-ot @kruithofa
/subsys/bluetooth/mesh/ @jhedberg @trond-snekvik @joerchan @Vudentz
/subsys/canbus/ @alexanderwachter
/subsys/cpp/ @pabigot @vanwinkeljan
/subsys/debug/ @nashif
/subsys/debug/coredump/ @dcpleung
/subsys/debug/gdbstub/ @ceolin
/subsys/debug/gdbstub.c @ceolin
/subsys/dfu/ @nvlsianpu
/subsys/tracing/ @nashif
/subsys/debug/asan_hacks.c @vanwinkeljan @aescolar @daor-oti
/subsys/demand_paging/ @dcpleung @nashif
/subsys/disk/disk_access_spi_sdhc.c @JunYangNXP
/subsys/disk/disk_access_sdhc.h @JunYangNXP
/subsys/disk/disk_access_usdhc.c @JunYangNXP
/subsys/disk/disk_access_stm32_sdmmc.c @anthonybrandon
/subsys/emul/ @sjg20
/subsys/fb/ @jfischer-no
/subsys/fs/ @nashif
/subsys/fs/fcb/ @nvlsianpu
/subsys/fs/fuse_fs_access.c @vanwinkeljan
/subsys/fs/littlefs_fs.c @pabigot
/subsys/fs/nvs/ @Laczen
/subsys/ipc/ @ioannisg
/subsys/logging/ @nordic-krch
/subsys/logging/log_backend_net.c @nordic-krch @jukkar
/subsys/lorawan/ @Mani-Sadhasivam
/subsys/mgmt/ec_host_cmd/ @jettr
/subsys/mgmt/mcumgr/ @carlescufi @nvlsianpu
/subsys/mgmt/hawkbit/ @Navin-Sankar
/subsys/mgmt/mcumgr/smp_udp.c @aunsbjerg
/subsys/mgmt/updatehub/ @nandojve @otavio
/subsys/mgmt/osdp/ @cbsiddharth
/subsys/net/buf.c @jukkar @jhedberg @tbursztyka @pfalcon
/subsys/net/ip/ @jukkar @tbursztyka @pfalcon
/subsys/net/lib/ @jukkar @tbursztyka @pfalcon
/subsys/net/lib/dns/ @jukkar @tbursztyka @pfalcon @cfriedt
/subsys/net/lib/lwm2m/ @rlubos
/subsys/net/lib/config/ @jukkar @tbursztyka @pfalcon
/subsys/net/lib/mqtt/ @jukkar @tbursztyka @rlubos
/subsys/net/lib/coap/ @rlubos
/subsys/net/lib/sockets/socketpair.c @cfriedt
/subsys/net/lib/sockets/ @jukkar @tbursztyka @pfalcon
/subsys/net/lib/tls_credentials/ @rlubos
/subsys/net/l2/ @jukkar @tbursztyka
/subsys/net/l2/canbus/ @alexanderwachter @jukkar
/subsys/net/*/openthread/ @rlubos
/subsys/power/ @nashif @pabigot @ceolin
/subsys/random/ @dleach02
/subsys/settings/ @nvlsianpu
/subsys/shell/ @jakub-uC @nordic-krch
/subsys/stats/ @nvlsianpu
/subsys/storage/ @nvlsianpu
/subsys/testsuite/ @nashif
/subsys/timing/ @nashif @dcpleung
/subsys/usb/ @jfischer-no @finikorg
/subsys/usb/class/dfu/usb_dfu.c @nvlsianpu
/tests/ @nashif
/tests/application_development/libcxx/ @pabigot
/tests/arch/arm/ @ioannisg @stephanosio
/tests/benchmarks/cmsis_dsp/ @stephanosio
/tests/boards/native_posix/ @aescolar @daor-oti
/tests/boards/intel_s1000_crb/ @dcpleung @sathishkuttan
/tests/bluetooth/ @joerchan @jhedberg @Vudentz
/tests/bluetooth/bsim_bt/ @joerchan @jhedberg @Vudentz @aescolar @wopu-ot
/tests/posix/ @pfalcon
/tests/crypto/ @ceolin
/tests/crypto/mbedtls/ @nashif @ceolin
/tests/drivers/can/ @alexanderwachter
/tests/drivers/counter/ @nordic-krch
/tests/drivers/counter/maxim_ds3231_api/ @pabigot
/tests/drivers/eeprom/ @henrikbrixandersen @sjg20
/tests/drivers/flash_simulator/ @nvlsianpu
/tests/drivers/gpio/ @mnkp @pabigot
/tests/drivers/hwinfo/ @alexanderwachter
/tests/drivers/spi/ @tbursztyka
/tests/drivers/uart/uart_async_api/ @Mierunski
/tests/kernel/ @dcpleung @andyross @nashif
/tests/lib/ @nashif
/tests/lib/cmsis_dsp/ @stephanosio
/tests/net/ @jukkar @tbursztyka @pfalcon
/tests/net/buf/ @jukkar @jhedberg @tbursztyka @pfalcon
/tests/net/lib/ @jukkar @tbursztyka @pfalcon
/tests/net/lib/http_header_fields/ @jukkar @tbursztyka
/tests/net/lib/mqtt_packet/ @jukkar @tbursztyka
/tests/net/lib/coap/ @rlubos
/tests/net/socket/socketpair/ @cfriedt
/tests/net/socket/ @jukkar @tbursztyka @pfalcon
/tests/subsys/debug/coredump/ @dcpleung
/tests/subsys/fs/ @nashif @nvlsianpu @de-nordic @pabigot
/tests/subsys/settings/ @nvlsianpu
/tests/subsys/shell/ @jakub-uC @nordic-krch
# Get all docs reviewed
*.rst @nashif
/doc/reference/kernel/ @andyross @nashif
*posix*.rst @aescolar @daor-oti
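
The "last matching pattern takes precedence" behaviour described in the file header can be sketched roughly as below. This is an illustration only, assuming fnmatch-style globbing; it does not reproduce all of GitHub's CODEOWNERS matching rules, and the two sample patterns are taken from the entries above.

import fnmatch

def owners_for(path: str, rules: list[tuple[str, list[str]]]) -> list[str]:
    """Return the owners of the last rule whose pattern matches the path."""
    path = "/" + path.lstrip("/")
    owners: list[str] = []
    for pattern, rule_owners in rules:
        glob = pattern + "*" if pattern.endswith("/") else pattern
        if fnmatch.fnmatch(path, glob):
            owners = rule_owners  # later matches override earlier ones
    return owners

rules = [
    ("/doc/", ["@dbkinder"]),
    ("*.rst", ["@nashif"]),
]
print(owners_for("doc/reference/kernel/index.rst", rules))  # -> ['@nashif']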


@@ -1,139 +1,78 @@
# Zephyr Project Code of Conduct
# Contributor Covenant Code of Conduct
## Our Pledge
We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, caste, color, religion, or sexual
identity and orientation.
We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.
In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, sex characteristics, gender identity and expression,
level of experience, education, socio-economic status, nationality, personal
appearance, race, religion, or sexual identity and orientation.
## Our Standards
Examples of behavior that contributes to a positive environment for our
community include:
Examples of behavior that contributes to creating a positive environment
include:
* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
and learning from the experience
* Focusing on what is best not just for us as individuals, but for the overall
community
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior include:
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery, and sexual attention or advances of
any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* The use of sexualized language or imagery and unwelcome sexual attention or
advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email address,
without their explicit permission
* Publishing others' private information, such as a physical or electronic
address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
professional setting
## Enforcement Responsibilities
## Our Responsibilities
Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.
Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.
Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, wiki edits, issues, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.
Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.
## Scope
This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.
This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
conduct@zephyrproject.org. Reports will be received by the Chair of the Zephyr
Governing Board, the Zephyr Project Director (Linux Foundation), and the Zephyr
Project Developer Advocate (Linux Foundation). You may refer to the [Governing
Board](https://zephyrproject.org/governing-board/) and [Linux Foundation
Staff](https://zephyrproject.org/staff/) web pages to identify who are the
individuals currently holding these positions.
All complaints will be reviewed and investigated promptly and fairly.
reported by contacting the project team at conduct@zephyrproject.org.
Reports will be received by Kate Stewart (Linux Foundation) and Amy Occhialino
(Intel). All complaints will be reviewed and investigated, and will result in a
response that is deemed necessary and appropriate to the circumstances. The
project team is obligated to maintain confidentiality with regard to the
reporter of an incident. Further details of specific enforcement policies may
be posted separately.
All community leaders are obligated to respect the privacy and security of the
reporter of any incident.
## Enforcement Guidelines
Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:
### 1. Correction
**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.
**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.
### 2. Warning
**Community Impact**: A violation through a single incident or series of
actions.
**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or permanent
ban.
### 3. Temporary Ban
**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.
**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.
### 4. Permanent Ban
**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior, harassment of an
individual, or aggression toward or disparagement of classes of individuals.
**Consequence**: A permanent ban from any sort of public interaction within the
community.
Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.1, available at
[https://www.contributor-covenant.org/version/2/1/code_of_conduct.html][v2.1].
The only changes made by The Zephyr Project to the original document were to
make explicit who the recipients of Code of Conduct incident reports are.
Community Impact Guidelines were inspired by
[Mozilla's code of conduct enforcement ladder][Mozilla CoC].
For answers to common questions about this code of conduct, see the FAQ at
[https://www.contributor-covenant.org/faq][FAQ]. Translations are available at
[https://www.contributor-covenant.org/translations][translations].
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html
[homepage]: https://www.contributor-covenant.org
[v2.1]: https://www.contributor-covenant.org/version/2/1/code_of_conduct.html
[Mozilla CoC]: https://github.com/mozilla/diversity
[FAQ]: https://www.contributor-covenant.org/faq
[translations]: https://www.contributor-covenant.org/translations
For answers to common questions about this code of conduct, see
https://www.contributor-covenant.org/faq


@@ -1,19 +0,0 @@
# Constant variables to be used across Kconfig options
# Copyright (c) 2024 basalte bv
# SPDX-License-Identifier: Apache-2.0
INT8_MIN := -128
INT16_MIN := -32768
INT32_MIN := -2147483648
INT64_MIN := -9223372036854775808
INT8_MAX := 127
INT16_MAX := 32767
INT32_MAX := 2147483647
INT64_MAX := 9223372036854775807
UINT8_MAX := 255
UINT16_MAX := 65535
UINT32_MAX := 4294967295
UINT64_MAX := 18446744073709551615

File diff suppressed because it is too large.


@@ -1,202 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.


@@ -1,10 +0,0 @@
> [!IMPORTANT]
> The license files in this directory are maintained as per the recommendation
> of the REUSE specification (https://reuse.software/spec-3.3).
> These are licenses that may apply to some components hosted in the source tree
> but that do not end up in built binaries
> (see https://docs.zephyrproject.org/latest/LICENSING.html#zephyr-licensing).
>
> These files do _not_ define the license of the Zephyr project itself.
> Zephyr is licensed under the Apache 2.0 license, as specified in
> the [LICENSE](/LICENSE) file at the root of the repository.

File diff suppressed because it is too large.

Makefile (new file, 30 lines)

@@ -0,0 +1,30 @@
#
# Top level makefile for documentation build
#
BUILDDIR ?= doc/_build
DOC_TAG ?= development
SPHINXOPTS ?= -q
KCONFIG_TURBO_MODE ?= 0
# Documentation targets
# ---------------------------------------------------------------------------
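# Example invocations (illustrative only; DOC_TAG=release below is just an
# example of overriding the defaults defined above):
#   make htmldocs                 # full HTML documentation build
#   make htmldocs-fast            # same build with KCONFIG_TURBO_MODE=1
#   make htmldocs DOC_TAG=release # build with a different documentation tag
#   make clean                    # remove ${BUILDDIR}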
.PHONY: clean htmldocs htmldocs-fast pdfdocs doxygen
htmldocs-fast:
	${MAKE} htmldocs KCONFIG_TURBO_MODE=1

htmldocs pdfdocs doxygen: configure
	cmake --build ${BUILDDIR} -- $@ # -v # VERBOSE=1

# Run CMake every time because it's quick and it re-configures TURBO_MODE if
# needed.
.PHONY: configure
configure:
	cmake -GNinja -B${BUILDDIR} -Sdoc/ -DDOC_TAG=${DOC_TAG} \
		-DSPHINXOPTS=${SPHINXOPTS} \
		-DKCONFIG_TURBO_MODE=${KCONFIG_TURBO_MODE}

clean:
	rm -rf ${BUILDDIR}


@@ -2,17 +2,15 @@
<a href="https://www.zephyrproject.org">
<p align="center">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="doc/_static/images/logo-readme-dark.svg">
<source media="(prefers-color-scheme: light)" srcset="doc/_static/images/logo-readme-light.svg">
<img src="doc/_static/images/logo-readme-light.svg">
</picture>
<img src="doc/images/Zephyr-Project.png">
</p>
</a>
<a href="https://bestpractices.coreinfrastructure.org/projects/74"><img src="https://bestpractices.coreinfrastructure.org/projects/74/badge"></a>
<a href="https://scorecard.dev/viewer/?uri=github.com/zephyrproject-rtos/zephyr"><img src="https://api.securityscorecards.dev/projects/github.com/zephyrproject-rtos/zephyr/badge"></a>
<a href="https://github.com/zephyrproject-rtos/zephyr/actions/workflows/twister.yaml?query=branch%3Amain"><img src="https://github.com/zephyrproject-rtos/zephyr/actions/workflows/twister.yaml/badge.svg?event=push"></a>
<a href="https://bestpractices.coreinfrastructure.org/projects/74"><img
src="https://bestpractices.coreinfrastructure.org/projects/74/badge"></a>
<a href="https://buildkite.com/zephyr/zephyr">
<img
src="https://badge.buildkite.com/f5bd0dc88306cee17c9b38e78d11bb74a6291e3f40e7d13f31.svg?branch=master"></a>
The Zephyr Project is a scalable real-time operating system (RTOS) supporting
@@ -23,12 +21,13 @@ The Zephyr OS is based on a small-footprint kernel designed for use on
resource-constrained systems: from simple embedded environmental sensors and
LED wearables to sophisticated smart watches and IoT wireless gateways.
The Zephyr kernel supports multiple architectures, including ARM (Cortex-A,
Cortex-R, Cortex-M), Intel x86, ARC, Tensilica Xtensa, and RISC-V,
SPARC, MIPS, and a large number of `supported boards`_.
The Zephyr kernel supports multiple architectures, including ARM Cortex-M,
Intel x86, ARC, Nios II, Tensilica Xtensa, and RISC-V, and a large number of
`supported boards`_.
.. below included in doc/introduction/introduction.rst
.. start_include_here
Getting Started
***************
@@ -36,12 +35,10 @@ Getting Started
Welcome to Zephyr! See the `Introduction to Zephyr`_ for a high-level overview,
and the documentation's `Getting Started Guide`_ to start developing.
.. start_include_here
Community Support
*****************
Community support is provided via mailing lists and Discord; see the Resources
Community support is provided via mailing lists and Slack; see the Resources
below for details.
.. _project-resources:
@@ -51,59 +48,40 @@ Resources
Here's a quick summary of resources to help you find your way around:
Getting Started
---------------
* **Help**: `Asking for Help Tips`_
* **Documentation**: http://docs.zephyrproject.org (`Getting Started Guide`_)
* **Source Code**: https://github.com/zephyrproject-rtos/zephyr is the main
repository; https://elixir.bootlin.com/zephyr/latest/source contains a
searchable index
* **Releases**: https://github.com/zephyrproject-rtos/zephyr/releases
* **Samples and example code**: see `Sample and Demo Code Examples`_
* **Mailing Lists**: users@lists.zephyrproject.org and
devel@lists.zephyrproject.org are the main user and developer mailing lists,
respectively. You can join the developer's list and search its archives at
`Zephyr Development mailing list`_. The other `Zephyr mailing list
subgroups`_ have their own archives and sign-up pages.
* **Nightly CI Build Status**: https://lists.zephyrproject.org/g/builds
The builds@lists.zephyrproject.org mailing list archives the CI
(shippable) nightly build results.
* **Chat**: Zephyr's Slack workspace is https://zephyrproject.slack.com. Use
this `Slack Invite`_ to register.
* **Contributing**: see the `Contribution Guide`_
* **Wiki**: `Zephyr GitHub wiki`_
* **Issues**: https://github.com/zephyrproject-rtos/zephyr/issues
* **Security Issues**: Email vulnerabilities@zephyrproject.org to report
security issues; also see our `Security`_ documentation. Security issues are
tracked separately at https://zephyrprojectsec.atlassian.net.
* **Zephyr Project Website**: https://zephyrproject.org
| 📖 `Zephyr Documentation`_
| 🚀 `Getting Started Guide`_
| 🙋🏽 `Tips when asking for help`_
| 💻 `Code samples`_
Code and Development
--------------------
| 🌐 `Source Code Repository`_
| 📦 `Releases`_
| 🤝 `Contribution Guide`_
Community and Support
---------------------
| 💬 `Discord Server`_ for real-time community discussions
| 📧 `User mailing list (users@lists.zephyrproject.org)`_
| 📧 `Developer mailing list (devel@lists.zephyrproject.org)`_
| 📬 `Other project mailing lists`_
| 📚 `Project Wiki`_
Issue Tracking and Security
---------------------------
| 🐛 `GitHub Issues`_
| 🔒 `Security documentation`_
| 🛡️ `Security Advisories Repository`_
| ⚠️ Report security vulnerabilities at vulnerabilities@zephyrproject.org
Additional Resources
--------------------
| 🌐 `Zephyr Project Website`_
| 📺 `Zephyr Tech Talks`_
.. _Zephyr Project Website: https://www.zephyrproject.org
.. _Discord Server: https://chat.zephyrproject.org
.. _supported boards: https://docs.zephyrproject.org/latest/boards/index.html
.. _Zephyr Documentation: https://docs.zephyrproject.org
.. _Introduction to Zephyr: https://docs.zephyrproject.org/latest/introduction/index.html
.. _Getting Started Guide: https://docs.zephyrproject.org/latest/develop/getting_started/index.html
.. _Contribution Guide: https://docs.zephyrproject.org/latest/contribute/index.html
.. _Source Code Repository: https://github.com/zephyrproject-rtos/zephyr
.. _GitHub Issues: https://github.com/zephyrproject-rtos/zephyr/issues
.. _Releases: https://github.com/zephyrproject-rtos/zephyr/releases
.. _Project Wiki: https://github.com/zephyrproject-rtos/zephyr/wiki
.. _User mailing list (users@lists.zephyrproject.org): https://lists.zephyrproject.org/g/users
.. _Developer mailing list (devel@lists.zephyrproject.org): https://lists.zephyrproject.org/g/devel
.. _Other project mailing lists: https://lists.zephyrproject.org/g/main/subgroups
.. _Code samples: https://docs.zephyrproject.org/latest/samples/index.html
.. _Security documentation: https://docs.zephyrproject.org/latest/security/index.html
.. _Security Advisories Repository: https://github.com/zephyrproject-rtos/zephyr/security
.. _Tips when asking for help: https://docs.zephyrproject.org/latest/develop/getting_started/index.html#asking-for-help
.. _Zephyr Tech Talks: https://www.zephyrproject.org/tech-talks
.. _Slack Invite: https://tinyurl.com/y5glwylp
.. _supported boards: http://docs.zephyrproject.org/latest/boards/index.html
.. _Zephyr Documentation: http://docs.zephyrproject.org
.. _Introduction to Zephyr: http://docs.zephyrproject.org/latest/introduction/index.html
.. _Getting Started Guide: http://docs.zephyrproject.org/latest/getting_started/index.html
.. _Contribution Guide: http://docs.zephyrproject.org/latest/contribute/index.html
.. _Zephyr GitHub wiki: https://github.com/zephyrproject-rtos/zephyr/wiki
.. _Zephyr Development mailing list: https://lists.zephyrproject.org/g/devel
.. _Zephyr mailing list subgroups: https://lists.zephyrproject.org/g/main/subgroups
.. _Sample and Demo Code Examples: http://docs.zephyrproject.org/latest/samples/index.html
.. _Security: http://docs.zephyrproject.org/latest/security/index.html
.. _Asking for Help Tips: https://docs.zephyrproject.org/latest/guides/getting-help.html


@@ -1,26 +0,0 @@
version = 1
# Declare default license and copyright text for files that typically do not or cannot include them.
[[annotations]]
path = [
# zephyr-keep-sorted-start
"**/*.cfg",
"**/*.cmake",
"**/*.conf",
"**/*.ecl",
"**/*.html",
"**/*.json",
"**/*.rst",
"**/*.yaml",
"**/*.yml",
"**/*_defconfig",
"**/CMakeLists.txt",
"**/Kconfig*",
"doc/requirements.in",
"doc/requirements.txt",
"scripts/requirements-*.in",
"scripts/requirements-*.txt",
# zephyr-keep-sorted-stop
]
SPDX-License-Identifier = "Apache-2.0"
SPDX-FileCopyrightText = "Copyright The Zephyr Project Contributors"


@@ -1 +0,0 @@
0.17.4


@@ -1,5 +1,5 @@
VERSION_MAJOR = 4
VERSION_MINOR = 3
PATCHLEVEL = 99
VERSION_MAJOR = 2
VERSION_MINOR = 5
PATCHLEVEL = 1
VERSION_TWEAK = 0
EXTRAVERSION =
EXTRAVERSION = rc1


@@ -1,8 +1,5 @@
# SPDX-License-Identifier: Apache-2.0
# FIXME: SHADOW_VARS: Remove this once we have enabled -Wshadow globally.
add_compile_options($<TARGET_PROPERTY:compiler,warning_shadow_variables>)
add_definitions(-D__ZEPHYR_SUPERVISOR__)
include_directories(

File diff suppressed because it is too large.


@@ -12,36 +12,8 @@ zephyr_cc_option(-fno-delete-null-pointer-checks)
zephyr_cc_option_ifdef(CONFIG_ARC_USE_UNALIGNED_MEM_ACCESS -munaligned-access)
if(NOT COMPILER STREQUAL arcmwdt)
if(CONFIG_THREAD_LOCAL_STORAGE)
# Instruct compiler to use proper register as cached thread pointer for thread local storage.
# For ARCv2 the default register is usually not specified - so we need to specify it
# For ARCv3 the register is fixed to r30, so we don't need to specify it
zephyr_compile_options_ifdef(CONFIG_ISA_ARCV2 -mtp-regno=26)
else()
# If thread local storage isn't used - we can safely schedule thread pointer register
zephyr_compile_options_ifdef(CONFIG_ISA_ARCV2 -mtp-regno=none)
endif()
endif()
# Instruct compiler to use register R26 as thread pointer
# for thread local storage.
zephyr_cc_option_ifdef(CONFIG_THREAD_LOCAL_STORAGE -mtp-regno=26)
add_subdirectory(core)
if(COMPILER STREQUAL arcmwdt)
add_subdirectory(arcmwdt)
if(CONFIG_64BIT)
zephyr_compile_options(-Ml)
# Instruct MWDT assembler not to warn when we load only lower half (32bit) of symbol
# instead of full 64bit address in ASM code. It is valid as we don't support Zephyr
# linkage to high addresses for 64bit ARC platforms.
zephyr_compile_options(-Wa,-offwarn=168)
endif()
endif()
if(CONFIG_ISA_ARCV3 AND CONFIG_64BIT)
set_property(GLOBAL PROPERTY PROPERTY_OUTPUT_FORMAT elf64-littlearc64)
elseif(CONFIG_ISA_ARCV3 AND NOT CONFIG_64BIT)
set_property(GLOBAL PROPERTY PROPERTY_OUTPUT_FORMAT elf32-littlearc64)
else()
set_property(GLOBAL PROPERTY PROPERTY_OUTPUT_FORMAT elf32-littlearc)
endif()


@@ -9,133 +9,70 @@ menu "ARC Options"
config ARCH
default "arc"
choice
prompt "ARC core family"
default CPU_ARCEM
config CPU_ARCEM
bool
bool "ARC EM cores"
select CPU_ARCV2
select ATOMIC_OPERATIONS_C
help
This option signifies the use of an ARC EM CPU
config CPU_ARCHS
bool
select ATOMIC_OPERATIONS_BUILTIN
select BARRIER_OPERATIONS_BUILTIN
bool "ARC HS cores"
select CPU_ARCV2
# FIXME: ATOMIC_OPERATIONS_BUILTIN still has some problem in arcmwdt
# toolchain, so choosing ATOMIC_OPERATIONS_C instead.
select ATOMIC_OPERATIONS_C if "$(ZEPHYR_TOOLCHAIN_VARIANT)" = "arcmwdt"
select ATOMIC_OPERATIONS_BUILTIN if "$(ZEPHYR_TOOLCHAIN_VARIANT)" != "arcmwdt"
help
This option signifies the use of an ARC HS CPU
choice
prompt "ARC Instruction Set"
default ISA_ARCV2
config ISA_ARCV2
bool "ARC ISA v2"
select ARCH_HAS_STACK_PROTECTION if ARC_HAS_STACK_CHECKING || (ARC_MPU && ARC_MPU_VER !=2)
select ARCH_HAS_USERSPACE if ARC_MPU
select ARCH_HAS_SINGLE_THREAD_SUPPORT if !SMP
select USE_SWITCH
select USE_SWITCH_SUPPORTED
help
v2 ISA for the ARC-HS & ARC-EM cores
config ISA_ARCV3
bool "ARC ISA v3"
select ARCH_HAS_SINGLE_THREAD_SUPPORT if !SMP
select USE_SWITCH
select USE_SWITCH_SUPPORTED
endchoice
if ISA_ARCV2
config CPU_EM4
bool
select CPU_ARCEM
help
If y, the SoC uses an ARC EM4 CPU
config CPU_EM4_DMIPS
bool
select CPU_ARCEM
help
If y, the SoC uses an ARC EM4 DMIPS CPU
config CPU_EM4_FPUS
bool
select CPU_ARCEM
help
If y, the SoC uses an ARC EM4 DMIPS CPU with the single-precision
floating-point extension
config CPU_EM4_FPUDA
bool
select CPU_ARCEM
help
If y, the SoC uses an ARC EM4 DMIPS CPU with single-precision
floating-point and double assist instructions
config CPU_EM6
bool
select CPU_ARCEM
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an ARC EM6 CPU
config CPU_HS3X
bool
select CPU_ARCHS
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an ARC HS3x CPU
config CPU_HS4X
bool
select CPU_ARCHS
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an ARC HS4x CPU
endif #ISA_ARCV2
if ISA_ARCV3
config CPU_HS5X
bool
select CPU_ARCHS
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an ARC HS5x CPU
config CPU_HS6X
bool
select CPU_ARCHS
select 64BIT
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an ARC HS6x CPU
endif #ISA_ARCV3
config FP_FPU_DA
bool
menu "ARC CPU Options"
menu "ARCv2 Family Options"
config ARC_HAS_ZOL
config CPU_ARCV2
bool
depends on ISA_ARCV2
select ARCH_HAS_STACK_PROTECTION if ARC_HAS_STACK_CHECKING || ARC_MPU
select ARCH_HAS_USERSPACE if ARC_MPU
select USE_SWITCH
select USE_SWITCH_SUPPORTED
default y
help
ARCv2 CPUs have a ZOL (zero-overhead loop) hardware loop mechanism, which the
ARCv3 ISA drops. Architecturally, ZOL provides:
- the LPcc instruction
- the LP_COUNT core register
- the LP_START and LP_END aux registers
Disabling this option removes usage of the ZOL registers from the code.
This option signifies the use of a CPU of the ARCv2 family.
config NUM_IRQ_PRIO_LEVELS
int "Number of supported interrupt priority levels"
@@ -159,8 +96,7 @@ config NUM_IRQS
config RGF_NUM_BANKS
int "Number of General Purpose Register Banks"
depends on ARC_FIRQ
depends on NUM_IRQ_PRIO_LEVELS > 1
depends on CPU_ARCV2
range 1 2
default 2
help
@@ -170,16 +106,9 @@ config RGF_NUM_BANKS
If fast interrupts are supported but there is only 1
register bank, the fast interrupt handler must save
and restore general purpose registers.
NOTE: more than one interrupt priority level is required to use the second
register bank; otherwise all interrupts use the same register bank. Such a
configuration is not supported in software and is not beneficial from a
performance point of view.
config ARC_FIRQ
bool "FIRQ enable"
depends on ISA_ARCV2
depends on NUM_IRQ_PRIO_LEVELS > 1
depends on !ARC_HAS_SECURE
default y
help
Fast interrupts are supported (FIRQ). If FIRQ enabled, for interrupts
@@ -187,13 +116,9 @@ config ARC_FIRQ
other regs will be saved according to the number of register bank;
If FIRQ is disabled, the handle of interrupts with highest priority
will be same with other interrupts.
NOTE: a configuration with FIRQ enabled and only one interrupt priority
level (so that all interrupts are FIRQs) is not allowed. Such a configuration
is not supported in software and is not beneficial from a performance point
of view.
config ARC_FIRQ_STACK
bool "Separate firq stack"
bool "Enable separate firq stack"
depends on ARC_FIRQ && RGF_NUM_BANKS > 1
help
Use separate stack for FIRQ handing. When the fast irq is also a direct
@@ -208,7 +133,6 @@ config ARC_FIRQ_STACK_SIZE
config ARC_HAS_STACK_CHECKING
bool "ARC has STACK_CHECKING"
depends on ISA_ARCV2
default y
help
ARC is configured with STACK_CHECKING which is a mechanism for
@@ -245,7 +169,7 @@ config ARC_STACK_PROTECTION
prioritized over the MPU-based stack guard.
config ARC_USE_UNALIGNED_MEM_ACCESS
bool "Unaligned access in HW"
bool "Enable unaligned access in HW"
default y if CPU_ARCHS
depends on (CPU_ARCEM && !ARC_HAS_SECURE) || CPU_ARCHS
help
@@ -253,17 +177,20 @@ config ARC_USE_UNALIGNED_MEM_ACCESS
to support unaligned memory access, which is then disabled by default.
Enable unaligned access in hardware and make the software use it.
config ARC_CURRENT_THREAD_USE_NO_TLS
bool
select CURRENT_THREAD_USE_NO_TLS
default y if (RGF_NUM_BANKS > 1) || ("$(ZEPHYR_TOOLCHAIN_VARIANT)" = "arcmwdt")
config FAULT_DUMP
int "Fault dump level"
default 2
range 0 2
help
Disable thread-local storage (TLS) access to the current thread pointer on ARC.
For cores with more than one register bank (RGF_NUM_BANKS > 1) TLS access is
disabled by default, because bank synchronization requires significant time and
slows down performance. ARCMWDT handles the TLS pointer differently than GCC;
optimized access to the TLS pointer via the _current symbol does not provide a
significant advantage with MetaWare.
Different levels for display information when a fault occurs.
2: The default. Display specific and verbose information. Consumes
the most memory (long strings).
1: Display general and short information. Consumes less memory
(short strings).
0: Off.
config GEN_ISR_TABLES
default y
@@ -274,7 +201,7 @@ config GEN_IRQ_START_VECTOR
config HARVARD
bool "Harvard Architecture"
help
The ARC CPU can be configured to have two buses;
The ARC CPU can be configured to have two busses;
one for instruction fetching and another that serves as a data bus.
config CODE_DENSITY
@@ -283,8 +210,8 @@ config CODE_DENSITY
Enable code density option to get better code density
config ARC_HAS_ACCL_REGS
bool "Reg Pair ACCL:ACCH (FPU and/or MPY > 6 and/or DSP)"
default y if CPU_HS3X || CPU_HS4X || CPU_HS5X || CPU_HS6X
bool "Reg Pair ACCL:ACCH (FPU and/or MPY > 6)"
default y if FPU
help
Depending on the configuration, CPU can contain accumulator reg-pair
(also referred to as r58:r59). These can also be used by gcc as GPR so
@@ -292,7 +219,6 @@ config ARC_HAS_ACCL_REGS
config ARC_HAS_SECURE
bool "ARC has SecureShield"
depends on ISA_ARCV2
select CPU_HAS_TEE
select ARCH_HAS_TRUSTED_EXECUTION
help
@@ -343,22 +269,11 @@ config ARC_NORMAL_FIRMWARE
resources of the ARC processors, and, therefore, it shall avoid
accessing them.
config ARC_VPX_COOPERATIVE_SHARING
bool "Cooperative sharing of ARC VPX vector registers"
select SCHED_CPU_MASK if MP_MAX_NUM_CPUS > 1
help
This option enables the cooperative sharing of the ARC VPX vector
registers. Threads that want to use those registers must successfully
call arc_vpx_lock() before using them, and call arc_vpx_unlock()
when done using them.
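A minimal C sketch of the locking pattern described in this help text (arc_vpx_lock()/arc_vpx_unlock() are named above; the k_timeout_t-based prototypes shown here are an assumption based on common Zephyr conventions):

```c
/* Sketch: cooperative sharing of the ARC VPX vector registers.
 * The function names come from the Kconfig help above; the exact
 * prototypes are assumed, so they are declared locally here. */
#include <zephyr/kernel.h>

int arc_vpx_lock(k_timeout_t timeout); /* assumed prototype */
void arc_vpx_unlock(void);             /* assumed prototype */

void do_vector_work(void)
{
	/* Wait up to 100 ms for exclusive use of the VPX registers. */
	if (arc_vpx_lock(K_MSEC(100)) != 0) {
		return; /* vector unit not acquired; skip the VPX work */
	}

	/* ... code that uses the VPX vector registers goes here ... */

	arc_vpx_unlock(); /* let other threads use the registers again */
}
```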
source "arch/arc/core/dsp/Kconfig"
menu "ARC MPU Options"
depends on CPU_HAS_MPU
config ARC_MPU_ENABLE
bool "Memory Protection Unit (MPU)"
bool "Enable MPU"
select ARC_MPU
help
Enable MPU
@@ -370,32 +285,9 @@ endmenu
config DCACHE_LINE_SIZE
default 32
config ARC_DCACHE_REGION_OPERATIONS
bool "DCACHE region operations"
depends on CACHE_MANAGEMENT && DCACHE
default n
help
Perform L1 data cache management operations by region rather than line by
line in a loop; this improves the performance of cache management operations.
config ARC_SLC
bool "System level cache"
depends on CACHE_MANAGEMENT && DCACHE && (CPU_HS4X || CPU_HS3X)
default n
help
This option enables System Level Cache, and adds SLC support to the data cache management operations.
config ARC_SLC_LINE_SIZE
int "SLC line size"
depends on ARC_SLC
default 128
help
Size in bytes of a CPU system level cache line.
config ARC_EXCEPTION_STACK_SIZE
int "ARC exception handling stack size"
default 768 if !64BIT
default 2048 if 64BIT
default 768
help
Size in bytes of exception handling stack which is at the top of
interrupt stack to get smaller memory footprint because exception
@@ -404,43 +296,13 @@ config ARC_EXCEPTION_STACK_SIZE
endmenu
config ARC_EARLY_SOC_INIT
bool "Make early stage SoC-specific initialization"
config ARC_EXCEPTION_DEBUG
bool "Unhandled exception debugging information"
default n
depends on PRINTK || LOG
help
Call SoC per-core setup code during early-stage initialization (before C
runtime initialization). The setup code is invoked via the
soc_early_asm_init_percpu assembler macro.
# ARC vector table must be aligned to 1KiB boundary, and will be at the
# start of the ROM region.
config ROM_START_OFFSET
default 0x400 if BOOTLOADER_MCUBOOT
config MAIN_STACK_SIZE
default 4096 if 64BIT
config ISR_STACK_SIZE
default 4096 if 64BIT
config SYSTEM_WORKQUEUE_STACK_SIZE
default 4096 if 64BIT
config IDLE_STACK_SIZE
default 1024 if 64BIT
config IPM_CONSOLE_STACK_SIZE
default 2048 if 64BIT
config TEST_EXTRA_STACK_SIZE
default 2048 if 64BIT
config CMSIS_THREAD_MAX_STACK_SIZE
default 2048 if 64BIT
config CMSIS_V2_THREAD_MAX_STACK_SIZE
default 2048 if 64BIT
config CMSIS_V2_THREAD_DYNAMIC_STACK_SIZE
default 2048 if 64BIT
Print human-readable information about exception vectors, cause codes,
and parameters, at a cost of code/data size for the human-readable
strings.
endmenu


@@ -1,5 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
if(CONFIG_ARCMWDT_LIBC OR CONFIG_CPP)
zephyr_sources(arcmwdt-dtr-stubs.c)
endif()

Some files were not shown because too many files have changed in this diff.