Compare commits

103 Commits

Author SHA1 Message Date
David B. Kinder
727f92c5d5 doc: update 1.9 release with RTD theme
We've replaced the "zephyr-docs" theme with a simpler read-the-docs
theme starting with the 1.13 release. This PR retrofits the theme
changes back onto this 1.9 release and ignores attempts to use the
zephyr-docs theme controlled by the make DOC_TAGS option.

Also backported changes to the warning filtering system
implemented in later releases, along with fixes to how the config
options are generated (genrest.py).

Signed-off-by: David B. Kinder <david.b.kinder@intel.com>
2018-08-10 08:11:41 -07:00
Kumar Gala
144579843e ci: Update to 0.2 docker image
For some reason the 0.1 release of the docker image isn't around, so
let's use 0.2.  This also means updating to SDK v0.9.2.

Signed-off-by: Kumar Gala <kumar.gala@linaro.org>
2018-08-07 08:49:47 -05:00
Carles Cufi
8f73afcdb6 Bluetooth: controller: Generate LE Conn Complete on cancellation
When using the LE Create Connection Cancel command, the controller is
supposed to return a Command Complete first and then an LE Connection
Complete Event after. Since the Link Layer does not generate an event
in this case, emulate the behavior in the HCI layer instead.

Signed-off-by: Carles Cufi <carles.cufi@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
72f9f37a2b Bluetooth: controller: Fix missing advDelay in Low Duty Cycle Directed
Fixed the controller implementation to apply the missing advDelay
to connectable directed advertising events used in low
duty cycle mode.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
1ffad048e8 Bluetooth: controller: Add missing HCI supp. cmd bits for PHY Update
Added missing HCI Supported Commands bit fields for PHY
Update feature. Also, refactored with missing conditional
compilation for other bit fields.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
c69ae2f4a1 Bluetooth: controller: Fix to use random CRC init value
Fixed controller implementation to use random CRC init value.

Fixes #6204.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
1913a9f5ca Bluetooth: controller: Fix soft latencies in periodic ticker
Add the missing code for the removal of any accumulated
soft latencies or negative drift ticks when scheduling the next
interval expiry with added laziness.

Typically a first interval would accumulate soft latencies,
and these have to be removed if the interval is rescheduled
with any added laziness (scheduled to the next soft real
time interval).

For example, scan windows block any new scheduling until the end
of the window, adding latencies to any soft real time ticker
expiry which should try to execute as early as possible after
the scan window.

Fixes: #6083

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
7558c32f1e Bluetooth: controller: Fix ticker to use u32_t ticks_slot
Ticker timespace reservations can range up to 10.24 seconds,
needing 19 bits to represent in 32 kHz clock units. Hence,
fix the controller implementation to use u32_t to store ticks
slot values.

Without this fix, the controller asserts in the scan_adv
sample when using continuous scanning with a 2-second interval
and window.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
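A minimal sketch of the arithmetic behind the commit above, assuming
the standard 32768 Hz (32 kHz) low-frequency clock; the program and
its names are illustrative only, not controller code.

#include <stdint.h>
#include <stdio.h>

int main(void)
{
	/* Maximum ticker reservation of 10.24 s expressed in 32768 Hz ticks. */
	uint32_t max_ticks = (uint32_t)(10.24 * 32768); /* 335544 ticks */

	/* 335544 exceeds the 65535 limit of a 16-bit field and needs 19 bits
	 * (2^18 = 262144 < 335544 <= 2^19 - 1), hence the move to u32_t.
	 */
	printf("max ticks_slot = %u\n", max_ticks);

	return 0;
}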
Vinayak Kariappa Chettimada
8fac8dfce9 Bluetooth: controller: Fix handling of invalid Ctrl PDU lengths
Fix the controller implementation to handle Control PDUs
with invalid lengths by responding with an Unknown Response
PDU.

Fixes LL.TS.5.0.2 conformance tests:
LL/PAC/SLA/BI-01-C [Control PDUs with Invalid Length from
                    Master]
LL/PAC/MAS/BI-01-C [Control PDUs with Invalid Length from
                    Slave]

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
cce8bf55db Bluetooth: controller: Add missing PDU struct definitions
Add missing PDU struct definitions in pdu.h file.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
302333c087 Bluetooth: controller: Fix HCI LE Set PHY invalid behavior check
Fix the HCI LE Set PHY command's invalid-behavior checks for
invalid parameters and unsupported features.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
c9686ab2b8 Bluetooth: controller: Fix redundant length update event
Fix generation of a redundant length update event when there is
no change in effective octets or time.

Fixes LL.TS.5.0.2 conformance tests:
LL/CON/MAS/BV-73-C [Master Data Length Update - Responding
                    to Data Length Update Procedure; LE 1M
                    PHY]
LL/CON/MAS/BV-74-C [Master Data Length Update - Initiating
                    Data Length Update Procedure; LE 1M PHY]
LL/CON/MAS/BV-76-C [Master Data Length Update - Responding
                    to Data Length Update Procedure; LE 2M
                    PHY]
LL/CON/MAS/BV-77-C [Master Data Length Update - Initiating
                    Data Length Update Procedure; LE 2M PHY]
LL/CON/SLA/BV-77-C [Slave Data Length Update - Responding
                    to Data Length Update Procedure; LE 1M
                    PHY]
LL/CON/SLA/BV-78-C [Slave Data Length Update - Initiating
                    Data Length Update Procedure; LE 1M PHY]
LL/CON/SLA/BV-80-C [Slave Data Length Update - Responding
                    to Data Length Update Procedure; LE 2M
                    PHY]
LL/CON/SLA/BV-81-C [Slave Data Length Update - Initiating
                    Data Length Update Procedure; LE 2M PHY]

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
58f50cc91d Bluetooth: controller: Fix to use CPR preferred periodicity value
Fixed implementation to use Connection Parameter Request
Procedure Preferred Periodicity value in calculating the new
connection interval used by the master role in Connection
Update Indication.

Fixes LL.TS.5.0.2 conformance tests:
LL/CON/MAS/BV-32-C [Accepting Connection Parameter Request -
                    Preferred_Periodicity]
LL/CON/MAS/BV-33-C [Accepting Connection Parameter Request -
                    Preferred_Periodicity and preferred
                    anchor points]

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
c687fb8c58 Bluetooth: controller: Fix to restrict addr set in initiator
Fix the controller implementation to restrict the HCI LE Set Random
Address command when the advertising, active scanning, and/or
initiator state is enabled.

Fixes LL.TS.5.0.2 conformance tests:
LL/CON/INI/BV-01-C [Connection Initiation]
LL/SEC/ADV/BV-01-C [Advertising With Static Address]
LL/SEC/SCN/BV-01-C [Random Address Scanning]

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
487ebc2cfb Bluetooth: controller: Fix for CPR with/without Feature Exchange
Fix implementation to support Connection Parameter Request
Procedure initiation with and without use of Feature
Exchange Procedure being performed in a connection.

Fixes LL.TS.5.0.2 conformance tests:
LL/CON/MAS/BV-81-C [Initiating Connection Parameter Request
                    - Unsupported Without Feature Exchange]
LL/CON/MAS/BV-82-C [Initiating Connection Parameter Request
                    - Unsupported With Feature Exchange]
LL/CON/SLA/BV-85-C [Initiating Connection Parameter Request
                    - Unsupported Without Feature Exchange]
LL/CON/SLA/BV-86-C [Initiating Connection Parameter Request
                    - Unsupported With Feature Exchange]

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
aaf9850576 Bluetooth: controller: Fix to restrict addr set in active scan
Fix the controller implementation to only restrict the HCI LE Set
Random Address command when advertising and/or active
scanning is enabled.

Continues to pass the following LL.TS.5.0.2 conformance
test:
LL/SEC/SCN/BV-01-C [Random Address Scanning]

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
6e2b996ee4 Bluetooth: controller: Fix to restrict addr set in active states
Fixed implementation to disallow setting Bluetooth device
address under active advertising or scanning states.

Fixes LL.TS.5.0.2 conformance tests:
LL/CON/INI/BV-01-C [Connection Initiation]
LL/SEC/ADV/BV-01-C [Advertising With Static Address]
LL/SEC/SCN/BV-01-C [Random Address Scanning]

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
2cf4e4ce91 Bluetooth: controller: Fix CPR initiation while in Enc. setup
Fixes a bug wherein the Connection Parameter Request Procedure was
initiated by the slave role while Encryption Setup had been
started by the peer master.

Fixes: #5823

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
e14188ecac Bluetooth: controller: Fix conn param req initiation check
Fixed the check related to initiating the connection parameter
request procedure. This avoids invalid repeated dispatch of the
connection parameter request PDU.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
d900daf28d Bluetooth: controller: Fix missing CPR procedure reset
When the master role responded to the connection parameter request
procedure with Unsupported Link Layer Parameter Value, a missing
reset of the connection parameter request procedure state caused
the next connection parameter request to be incorrectly responded
to with the same procedure collision extended reject ind PDU. This
caused an eventual disconnection with reason LMP response timeout.
This is now fixed by resetting the state correctly.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
Vinayak Kariappa Chettimada
039805224c Bluetooth: controller: Fix compile error when LE_ENC is disabled
Fixes the following compile error when CONFIG_BT_CTLR_LE_ENC
is disabled:
subsys/bluetooth/controller/ll_sw/ctrl.c: In function
'isr_rx_conn_pkt_ctrl':
subsys/bluetooth/controller/ll_sw/ctrl.c:2613:29: error:
'LLCP_ENCRYPTION' undeclared (first use in this function)
         (conn->llcp_type != LLCP_ENCRYPTION)) ||
                             ^~~~~~~~~~~~~~~
Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2018-04-30 07:36:09 -04:00
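The fix above follows the usual pattern of fencing code that
references symbols only defined when a Kconfig option is enabled.
A simplified, self-contained sketch of that pattern; apart from
CONFIG_BT_CTLR_LE_ENC and LLCP_ENCRYPTION, every name below is a
stand-in, not the actual ctrl.c code.

/* Stand-ins for the real controller types. */
struct conn_ctx {
	int llcp_type;
};

#if defined(CONFIG_BT_CTLR_LE_ENC)
#define LLCP_ENCRYPTION 1
#endif

static void handle_non_enc_procedure(struct conn_ctx *conn) { (void)conn; }

static void isr_rx_ctrl(struct conn_ctx *conn)
{
#if defined(CONFIG_BT_CTLR_LE_ENC)
	/* LLCP_ENCRYPTION exists only when LE encryption is compiled in,
	 * so the comparison lives under the same guard.
	 */
	if (conn->llcp_type != LLCP_ENCRYPTION) {
		handle_non_enc_procedure(conn);
	}
#else
	handle_non_enc_procedure(conn);
#endif
}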
Anas Nashif
4c52b42b13 doc: add a new template variable
Signify if the documentation is for a release or if it is the
development version from master.

Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2017-11-19 10:30:48 -05:00
Anas Nashif
a7dcd9d827 release: Zephyr 1.9.2
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2017-11-19 08:20:47 -05:00
Carles Cufi
2741498afc doc: 1.9.x release notes
Release notes for the following Zephyr kernel versions:

* 1.9.1
* 1.9.2

Signed-off-by: Carles Cufi <carles.cufi@nordicsemi.no>
2017-11-19 08:20:47 -05:00
Vinayak Kariappa Chettimada
44e8b69ecd Bluetooth: controller: Fix diff proc collision with enc proc
Fixes the following conformance test regression failure
introduced in commit 7dd5fbee26 ("Bluetooth: controller:
Fix MIC error due to parallel Enc Proc")

TP/CON/MAS/BV-28-C [Initiating Connection Parameter Request
different procedure collision encryption]

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-11-02 18:03:39 -04:00
Vinayak Kariappa Chettimada
19f7a6f3b9 Bluetooth: controller: Fix MIC error due to parallel Enc Proc
Fix to disallow initiating LE Start Encryption while another
procedure is in progress. Similarly, disallow initiating
another procedure while Encryption procedure is in progress.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-11-02 18:03:39 -04:00
Vinayak Kariappa Chettimada
4aaf544e8c Bluetooth: controller: Fix to enable Asym PHY on nRF52 Series
Fix the controller Kconfig to enable use of fast radio ramp
up by default, hence enabling support for Asym PHY updates
by default on nRF52 Series SoCs.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-30 14:40:13 -04:00
Vinayak Kariappa Chettimada
8d9cd870e4 Bluetooth: controller: Fix Ctrl PDU Tx starvation assert
Replace all controller asserts in control procedure responses
that checked for buffer availability with an implementation
that nacks request PDUs if there are no buffers to prepare
response PDUs.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-30 14:40:13 -04:00
Vinayak Kariappa Chettimada
ed94d2adb7 Bluetooth: controller: Remove assert on invalid LL id
Remove an assert on receiving an invalid LL ID; drop these
invalid PDUs instead.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-30 14:40:13 -04:00
Johan Hedberg
4b98d72156 Bluetooth: Mesh: Fix missing feature bits
The feature bits for Proxy and Friend were missing in the composition
data and heartbeat messages.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Steve Brown
46ffe456f0 Bluetooth: Mesh: heartbeat fixes for message count
Both count and period must be non-zero for message publication.
Stop publication when count becomes zero.
Add count to the debug message in hb_publish.

Signed-off-by: Steve Brown <sbrown@cortland.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
218dd04b3e Bluetooth: Mesh: Fix mod sub status parameters upon failure
Mesh Profile Specification v1.0, 4.4.1.2.8:

"When an element receives a Config Model Subscription Add message
or a Config Model Subscription Virtual Address Add message that
is not successfully processed (i.e., it results in an error condition
listed in Table 4.113), it shall respond with the Config Model
Subscription Status message, setting its fields to the values
of the corresponding fields (i.e., the identically named fields)
of the incoming message and setting the Status field to a status code
(defined in Table 4.113), and setting all other fields to 0."

The same applies to other Model Subscription messages.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
b8bbd7439d Bluetooth: Mesh: Fix potential access to uninitialized variable
Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
35c025fc07 Bluetooth: Mesh: Fix missing initialization of bt_mesh.local_queue
The local_queue was never being initialized.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
439fbddbb0 Bluetooth: Mesh: Fix revoking app keys
The needed code for taking updated app keys into use and revoking the
old ones was missing.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
c67f64ed36 Bluetooth: Mesh: Fix dropping valid proxy configuration messages
Proxy configuration messages are allowed (in fact required) to use
unassigned addresses, so they should be exempt from this check.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
d68d799057 Bluetooth: Mesh: Don't send health status messages if a test fails
The test failure may be e.g. because of an unknown company id, and in
that case the spec expects us to ignore the message.

With this patch it should be possible to pass MESH/SR/HM/RFS/BI-01-C.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
bda57337f2 Bluetooth: Mesh: Fix setting health period divider
A previous patch which moved dispatching the health publish callback
to a later moment introduced a regression where the period divider
does not get updated when it should. In fact, having the divider as
part of the Health Server context is redundant, since the same
information is already stored generically in the model publication
context. Switching to using the model publication context makes things
simpler and ensures that the value is always up-to-date.

With this patch it is possible to pass MESH/SR/HM/CFS/BV-02-C.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
3e2305359e Bluetooth: Mesh: Fix Set LPNTimeout message handling
We should ignore invalid addresses (helps pass
MESH/NODE/CFG/LPNPT/BI-01-C). Also fix a copy-paste issue in an error
log.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
a23d390225 Bluetooth: Mesh: Fix resetting heartbeat subscription expiry properly
Set the value clearly to 0 instead of letting the old expiry time stay
around.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
2c86d827ec Bluetooth: Mesh: Fix zeroing heartbeat state
The values all need to be zeroed when heartbeat subscription is
disabled. This makes it possible to pass MESH/NODE/CFG/HBS/BV-01-C.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
84286b67fa Bluetooth: Mesh: Fix spelling of "heartbeat"
Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
9199844d5b Bluetooth: Mesh: Fix encoding fault count to Health Current Status
There was a missing adjustment to buf->len after fetching the faults
from the app.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
f7279ea0fb Bluetooth: Mesh: Fix Health Period Set OpCode
This was a copy-paste mistake.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
271da26813 Bluetooth: Mesh: Remove local network interface Kconfig option
From section 3.4.5.3 in the Mesh Profile Specification 1.0:

    "A node shall implement a Local Network Interface."

Removing the Kconfig option also helps clean up quite a lot of code.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
061e8096be Bluetooth: Mesh: Allow TTL <= 1 for the local net interface
The bt_mesh_net_relay() function needs to allow TTL <= 1 for the local
network interface since that's the code path that locally originated
outgoing packets take.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
06c1d01a4d Bluetooth: Mesh: Send ack for every message with matching FCS
Mesh Profile Specification v1.0, 5.3.3:

"On the PB-ADV bearer, when the receiver has received all segments of
a transaction, the receiver shall calculate the FCS for the received
Provisioning PDU, and if it matches the FCS field in the Transaction
Start PDU, it shall send a Transaction Acknowledgment PDU after
a random delay between 20 and 50 milliseconds."

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
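A rough sketch of honouring the 20-50 ms window quoted above using a
delayed work item; ack_work and schedule_ack are hypothetical names
rather than the actual provisioning code, and the random source is
assumed to be Zephyr's sys_rand32_get().

#include <kernel.h>

/* Prototype normally provided by Zephyr's rand32 header. */
extern u32_t sys_rand32_get(void);

/* Hypothetical work item whose handler sends the Transaction Ack PDU. */
static struct k_delayed_work ack_work;

static void schedule_ack(void)
{
	/* Random delay of 20 + [0..30] ms, i.e. within the 20-50 ms window. */
	s32_t delay_ms = 20 + (sys_rand32_get() % 31);

	k_delayed_work_submit(&ack_work, delay_ms);
}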
Johan Hedberg
1f689c6f2e Bluetooth: Mesh: Always set company ID in health status message
The Mesh specification recommends defaulting to the company ID in the
composition data when no other ID is relevant (e.g. in error cases or
if the app has not provided a callback).

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
bddd8f4b37 Bluetooth: Mesh: Set timer for periodic publish before publishing
Encrypting and sending a message takes a considerable amount of time
which makes the publication period longer than expected.

With this patch it is possible to pass MESH/SR/HM/CFS/BV-02-C test.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
6e2eba45d0 Bluetooth: Mesh: Fix encoding health status when app has no callback
The branch for handling the case when the app has not provided a
callback for health faults was encoding the payload in a wrong way.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
14c96b17a2 Bluetooth: Mesh: Fix copy-paste mistake when assigning to cfg->frnd
The right value is BT_MESH_FRIEND_NOT_SUPPORTED.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
a78b466c1e Bluetooth: Mesh: Drop invalid destination addresses
This is required to pass certain PTS tests.

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
971080ca44 Bluetooth: Mesh: Fix SDU length check
The code was passing the wrong first parameter to the sdu_len_is_ok()
function.

Fixes #3985
Fixes #3984

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Johan Hedberg
92f5487d07 Bluetooth: Mesh: Fix string signedness issue
Some compilers (like icx) may complain about unsigned vs signed
strings.

Fixes #3985

Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2017-10-19 05:34:48 -04:00
Luiz Augusto von Dentz
051a5406ae tests: fifo: Add prj_poll.conf
This enables running the k_fifo test with CONFIG_POLL=y.

Signed-off-by: Luiz Augusto von Dentz <luiz.von.dentz@intel.com>
2017-10-18 16:38:57 -04:00
Luiz Augusto von Dentz
49ead11a00 queue: k_queue_cancel_wait: Fix not interrupting other threads
When k_poll is being used, k_queue_cancel_wait shall mark the state as
K_POLL_STATE_NOT_READY so other threads will get properly notified with
a NULL pointer return.

Signed-off-by: Luiz Augusto von Dentz <luiz.von.dentz@intel.com>
2017-10-18 16:38:57 -04:00
Luiz Augusto von Dentz
0cc21ad9ba poll: k_poll: Return -EINTR if not ready
In case _handle_obj_poll_events is called with K_POLL_STATE_NOT_READY,
set -EINTR as the return value to the poller thread.

Signed-off-by: Luiz Augusto von Dentz <luiz.von.dentz@intel.com>
2017-10-18 16:38:57 -04:00
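With the two poll/queue fixes above, a poller can now observe -EINTR
when the queue it waits on is cancelled. A minimal, illustrative
consumer (my_fifo and the thread body are placeholders, not kernel
test code):

#include <kernel.h>

K_FIFO_DEFINE(my_fifo);

void poller_fn(void)
{
	struct k_poll_event events[1];
	int rc;

	k_poll_event_init(&events[0], K_POLL_TYPE_FIFO_DATA_AVAILABLE,
			  K_POLL_MODE_NOTIFY_ONLY, &my_fifo);

	rc = k_poll(events, 1, K_FOREVER);
	if (rc == -EINTR) {
		/* The wait was cancelled, e.g. via k_queue_cancel_wait();
		 * there is nothing to fetch.
		 */
		return;
	}

	/* The event signalled data; fetch it without blocking. */
	void *item = k_fifo_get(&my_fifo, K_NO_WAIT);

	ARG_UNUSED(item);
}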
Luiz Augusto von Dentz
ea6cf668c0 queue: k_queue_get: Fix NULL return
k_queue_get shall never return NULL when the timeout is K_FOREVER,
which could happen when a higher-priority thread cancels/takes an item
before the waiting thread.

Fixes issue #4358

Signed-off-by: Luiz Augusto von Dentz <luiz.von.dentz@intel.com>
2017-10-18 16:38:57 -04:00
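The guarantee described above is what lets a consumer rely on a
non-NULL result when it blocks with K_FOREVER; a short illustrative
loop (work_fifo and process_item are placeholders):

#include <kernel.h>

K_FIFO_DEFINE(work_fifo);

static void process_item(void *item) { ARG_UNUSED(item); }

void consumer_fn(void)
{
	for (;;) {
		/* Blocks until an item is available; with this fix the
		 * pointer is never NULL, even if a higher-priority thread
		 * raced for an earlier item.
		 */
		void *item = k_fifo_get(&work_fifo, K_FOREVER);

		process_item(item);
	}
}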
Paul Sokolovsky
2341411c39 kernel: queue: k_queue_poll: Fix slist access race condition
None of the sys_slist_*() functions are thread-safe, and calls to them
must be protected with irq_lock. This is usually done in a wider
caller context, but k_queue_poll() is called with irq_lock already
relinquished, and is thus subject to hard-to-detect and hard-to-explain
race conditions, as e.g. was tracked in #4022.

Signed-off-by: Paul Sokolovsky <paul.sokolovsky@linaro.org>
2017-10-18 16:38:57 -04:00
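The locking rule the commit above relies on, sketched with made-up
list and function names; the only point is that every sys_slist_*()
access happens inside an irq_lock()/irq_unlock() pair.

#include <kernel.h>
#include <misc/slist.h>

static sys_slist_t pending_list;

sys_snode_t *take_first_node(void)
{
	sys_snode_t *node;
	unsigned int key;

	key = irq_lock();               /* sys_slist_*() is not thread-safe */
	node = sys_slist_get(&pending_list);
	irq_unlock(key);

	return node;
}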
Carles Cufi
4195177a9b Bluetooth: controller: Fix flow control packet drop
The main purpose of recv_thread is to process incoming events from the
radio and also any buffered items waiting to be dispatched to the Host
that are pending because of a lack of Host buffers.
When an iteration of the recv_thread obtains an element from the radio,
it needs to process it immediately, either sending it straight away to
the Host or appending it to the queue. This was not the case before
this patch, where a buffered packet arriving concurrently with one
coming from the radio would cause the latter to be "dropped", causing
missing packets.

Signed-off-by: Carles Cufi <carles.cufi@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Carles Cufi
d3e9ade200 Bluetooth: controller: Fix Controller to Host flow control leak
When a connection is disconnected with outstanding unacked packets, the
Host has no way to signal or acknowledge their processing to the
Controller, since it is illegal to send a Host Number of Completed
Packets command when the connection is not up. Instead, consider the
outstanding packets as acked in order not to affect the correct flow
control.

Signed-off-by: Carles Cufi <carles.cufi@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Vinayak Kariappa Chettimada
38609c09d1 Bluetooth: controller: Add Coded PHY packet tx time restrictions
Add implementation to support Coded PHY update procedure
with packet transmit time restrictions.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Vinayak Kariappa Chettimada
49710776a5 Bluetooth: controller: Add encrypted Coded PHY support
Add support for encrypted Coded PHY connections on nRF52840
SoCs.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Vinayak Kariappa Chettimada
c9234cc0ec Bluetooth: controller: Fix PA/LNA for Coded PHY
Use S8 coding Rx chain delay timings to calculate the PA
pin assertions when in Coded PHY.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Vinayak Kariappa Chettimada
1885ebe67a Bluetooth: controller: Fix tIFS calc for Coded PHY
Always use the S8 Rx chain delay instead of the coding of the
actually received packet. I believe that, as the packet always
starts with S8, using S8 timings gives a near-correct tIFS
value.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Vinayak Kariappa Chettimada
5f6ee6d6b1 Bluetooth: controller: Fix Coded PHY supervision timeout
When calculating and setting up the header complete timeout,
use the S8 coding Rx chain delay.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Vinayak Kariappa Chettimada
cbb7568aec Bluetooth: controller: Fix connection update supervision timeout
In commit dd52b8ea02 ("Bluetooth: controller: Fix
first connection interval timing"), instead of using just a
tick unit as a workaround, the microseconds corresponding to a
tick unit were used while calculating the window offset to be
used at the connection update instant. This introduced an
error in scheduling the first event with the new connection
parameters, causing a supervision timeout of the connection
update procedure.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Vinayak Kariappa Chettimada
20ef2f5faf Bluetooth: controller: Fix missing reset of FC feature
Fixed a missing reset of the FC feature on HCI reset. This
feature provided simple connection-handle-based event
exclusions, but it is no longer needed with the
support for controller-to-host flow control. This feature
should be removed in the future.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Vinayak Kariappa Chettimada
6017e2db35 Bluetooth: controller: Fix missing PHY update procedure reset
When a peer master performed a PHY update procedure with no
change, the state machine was not released. This blocked
any future local initiation of the procedure and also
led to termination of the connection with reason LMP
response timeout.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Vinayak Kariappa Chettimada
9d836af286 Bluetooth: controller: Fix NRF_AAR use
Fixed the usage of the NRF_AAR peripheral for controller privacy
to clear events on configure and on every radio ISR entry.

Without this fix, there were spurious AAR matches leading to
controller asserts.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Vinayak Kariappa Chettimada
827ddbf09c Bluetooth: controller: Fix PHY Update response state transition
The PHY Update procedure timeout was started without transitioning
to the state that waits for the procedure to complete. This
prevented the timeout from being reset on successful
completion of the procedure and eventually led to a
connection termination with reason LMP Response Timeout.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Vinayak Kariappa Chettimada
1704a7c915 Bluetooth: controller: Fix CPR procedure's Conn Upd initiation
Fix the Connection Parameter Request Procedure's Connection
Update Procedure initiation to calculate the offset rather
than selecting offsets from an out-of-bound memory area.

The symptom of the bug was a supervision timeout
due to an incorrect offset communicated to the peer and a
wrong offset used in scheduling the connection events.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Vinayak Kariappa Chettimada
041ad3d426 Bluetooth: controller: Fix Conn Param Req response timeout
When the peer slave rejects a Connection Parameter Request
Procedure, the controller proceeds to perform a Connection
Update Procedure without clearing the procedure timer, which
eventually causes the connection to terminate. This is
fixed by clearing the procedure timeout when the Connection
Update Procedure completes.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Vinayak Kariappa Chettimada
cf3b3b93ca Bluetooth: controller: Fix slave from initiating conn upd ind
If a peer in the master role has support for the Connection
Parameter Request Procedure set in its supported features but
sends an Extended Reject Ind as a response to the procedure,
the controller incorrectly initiated a Connection Update
Procedure, which is not permitted in the slave role.

This would lead to a connection timeout after the instant used
in the invalid Connection Update Procedure.

This is fixed by initiating a Connection Update Procedure
only when in the master role.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-10-18 16:38:57 -04:00
Anas Nashif
1c0368764c ci: compliance: decode output to utf8
Fixes GH-1580.

Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2017-10-12 22:00:11 -05:00
Anas Nashif
ab35052b89 Zephyr 1.9.1
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2017-10-11 22:37:44 -04:00
Anas Nashif
b19f73de1d ci: compliance script should use python3
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2017-10-06 11:49:38 -04:00
Anas Nashif
be823dcffb ci: build commits, not only PR
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2017-10-06 11:49:38 -04:00
Anas Nashif
d7070b77eb doc: update release notes with ARC details
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2017-10-06 09:36:48 -04:00
Anas Nashif
d3831de32d doc: update release notes index with 1.9
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2017-10-06 09:36:48 -04:00
Jukka Rissanen
92ce3bc2d6 net: dns: Do not resolve IPv6 address if IPv6 is disabled
If IPv6 is disabled, then it is useless to try to resolve an
IPv6 address because "struct sockaddr" does not have enough
space to store an IPv6 address.

Fixes #1487

Signed-off-by: Jukka Rissanen <jukka.rissanen@linux.intel.com>
2017-10-06 09:36:48 -04:00
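A hedged sketch of the kind of guard described above: refuse an IPv6
query in an IPv4-only build instead of writing past what "struct
sockaddr" can hold. The function name and error code choice are
illustrative, not the actual dns_resolve code.

#include <errno.h>
#include <kernel.h>
#include <net/net_ip.h>

int resolve_name(const char *name, int family)
{
	ARG_UNUSED(name);
	ARG_UNUSED(family);

#if !defined(CONFIG_NET_IPV6)
	if (family == AF_INET6) {
		/* struct sockaddr cannot hold an IPv6 address in this
		 * build, so reject the query up front.
		 */
		return -EPFNOSUPPORT;
	}
#endif
	/* ... perform the actual DNS query here ... */
	return 0;
}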
Carles Cufi
b407d02637 Bluetooth: controller: Common config for VS extensions
Since the Zephyr HCI VS extensions apply to both the Host (using them
for additional functionality) and the Controller (implementing the
commands and events), it makes sense to make this a common setting so
that it can be configured in a way that applies to both.

Signed-off-by: Carles Cufi <carles.cufi@nordicsemi.no>
2017-10-03 19:30:03 -04:00
Carles Cufi
416945ae75 Bluetooth: controller: Disable PA/LNA for nRF51x
The PA/LNA feature is not functional on nRF51x series due to added
interrupt latency. Disable this feature unconditionally for those ICs to
avoid unexpected behavior.

Signed-off-by: Carles Cufi <carles.cufi@nordicsemi.no>
2017-10-03 19:30:03 -04:00
Vinayak Kariappa Chettimada
89689e0286 Bluetooth: controller: Fix HCI Reset hang
Issuing the HCI Reset command while having connections sometimes
hung the controller.

ll_reset supplied an invalid stop ticker ID to role_disable
when trying to stop all connections. The connection role does
not utilize a stop ticker. The invalid ticker ID supplied
referenced memory outside the pool of tickers and, depending on
the RAM contents there, the controller would hang
trying to stop connections.

Fixed by not calling the ticker_stop interface with invalid
ticker IDs.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Vinayak Kariappa Chettimada
91f4c387ae Bluetooth: controller: GPIO PA/LNA feature
Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Carles Cufi
d2c4dbe6d7 Bluetooth: controller: Add PA/LNA GPIO Kconfig option
Signed-off-by: Carles Cufi <carles.cufi@nordicsemi.no>
Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Carles Cufi
ec53508b06 Bluetooth: controller: Implement Read Build Info cmd
Implement the Read Build Information VS command. This returns a UTF-8
encoded string, which is extendable by the user via a new Kconfig
option.

Signed-off-by: Carles Cufi <carles.cufi@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Carles Cufi
7f65aa8f78 Bluetooth: controller: Implement Write BD_ADDR VS cmd
Implement the Zephyr VS command that allows a Host to write a public
Bluetooth Address to the Controller in order to allow Hosts to provide
their own public Bluetooth addresses.

Signed-off-by: Carles Cufi <carles.cufi@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Vinayak Kariappa Chettimada
705fb8ec26 Bluetooth: controller: Move tIFS s/w switch PPI index up by one
Move the tIFS software-switching PPI index up by one
position to make room for the PA/LNA implementation.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Vinayak Kariappa Chettimada
d42bd89957 Bluetooth: controller: Use single PPI channel for AA capture
The earlier design captured AA twice in the first Rx of a slave
connection event and retained one of the captures until the end
of the event to calculate drift.

The design is updated to use a single capture of AA and to save
the first AA capture of a slave connection event in RAM instead.

This frees up a PPI channel in the controller design.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Vinayak Kariappa Chettimada
eebdb3b693 Bluetooth: controller: Internal document purpose of end capture
Document internally the purposes of the various Tx/Rx PDU end
capture setups.

Also, remove any redundant capture of the packet end.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Vinayak Kariappa Chettimada
4d01527866 Bluetooth: controller: Minor refactor radio_tmr_start
Minor refactor of radio_tmr_start to reduce duplicate
assignments common in the if-then-else control paths.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Vinayak Kariappa Chettimada
60b37e46ca Bluetooth: controller: Map debug pins to P3 pin head on nRF5x DK
Updated debug pin mapping so that the outputs are on P3 pin
head on all nRF5x Development Kits.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Vinayak Kariappa Chettimada
7ee9fb5ced Bluetooth: controller: Add radio setup HAL interface
Added a new interface to perform setup of radio hardware.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Vinayak Kariappa Chettimada
2d08b180b5 Bluetooth: controller: Fix NRF_CCM config for 2M PHY
Fixed the configuration of NRF_CCM for 2M PHY connections.
The faster 2M data rate mode will now be used when a connection
is on the 2M PHY.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Vinayak Kariappa Chettimada
8136c0913f Bluetooth: controller: Use correct NRF_AAR enable define
Use the correct NRF_AAR enable macro defines from the Nordic MDK.
The old code functionally worked fine even though it did not set
the correct enable value.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Vinayak Kariappa Chettimada
59d10ea93a Bluetooth: controller: Remove unreachable workaround code
Workaround code for the nRF52840 placed under nRF51 conditional
compilation is removed.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Vinayak Kariappa Chettimada
c86b716481 Bluetooth: controller: Fix conn param req procedure timeout
Fixed a bug where the Connection Parameter Request Procedure,
when initiated in the master role, caused the
connection to terminate with reason LL response timeout.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Carles Cufi
ef7a6c485f Bluetooth: controller: Fix handling of Read Static Addrs cmd
The status in the Command Complete event was uninitialized, leading to
incorrect contents of the event parsed by the Host. Correctly initialize
the status to success.

Signed-off-by: Carles Cufi <carles.cufi@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Vinayak Kariappa Chettimada
66817cbac9 Bluetooth: controller: Fix hang on directed adv timeout
During testing it was discovered that the directed advertising
timeout was missing an implementation to handle the timeout
happening while the next event is already in preparation.
The consequence was that after the event ticker expired,
the counter was shut down, stalling the set-up PPI from
starting the erroneous advertising and leaving the controller
in an invalid hung state.

This has been fixed by correctly handling both cases: stop
between prepare and event, and stop inside the radio advertising
event. The fix takes care of putting the radio active
callback and HF clock in the correct states.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Carles Cufi
543acefc43 samples: bluetooth: hci: Fix TX memory leak
Whenever a buffer is sent to the driver via bt_raw using bt_send() the
buffer might not be consumed if an error is returned. In that case
unreference the buffer to avoid leaking the already allocated net_buf.

Signed-off-by: Carles Cufi <carles.cufi@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Carles Cufi
f913f9b9f0 Bluetooth: controller: Issue Data Buffer Overflow event
Whenever the HCI ACL flow control is violated by the Host, a Data Buffer
Overflow event is now issued by the Controller (if enabled) to notify
the Host of the buffer overrun.

Signed-off-by: Carles Cufi <carles.cufi@nordicsemi.no>
2017-09-30 09:57:38 -04:00
Ricardo Salveti
3a4fb6e24e pwm: stm32: Fix check for APB prescale
RCC_HCLK_DIV1 translates to 0x0 while apb_psc uses the value defined
by CONFIG_CLOCK_STM32_APB1/2_PRESCALER (range from 1 to 16).

Manually check if the defined prescaler is 1 or not and use that to
calculate the correct timer clock.

Signed-off-by: Ricardo Salveti <ricardo.salveti@linaro.org>
2017-09-13 10:44:23 -05:00
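The timer-clock rule the commit relies on is a well-known STM32
property: timers on an APB bus run at the bus frequency when the APB
prescaler is 1 and at twice the bus frequency otherwise. A small
illustrative helper (the names are not the actual pwm_stm32 driver
code):

#include <stdint.h>

/* Derive the timer input clock from the APB bus clock and its prescaler. */
static uint32_t timer_clock(uint32_t apb_clock_hz, uint32_t apb_prescaler)
{
	/* Prescaler of 1: timers run at the APB clock.
	 * Any larger prescaler: the hardware feeds the timers with
	 * twice the APB clock.
	 */
	return (apb_prescaler == 1U) ? apb_clock_hz : apb_clock_hz * 2U;
}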
48280 changed files with 3753010 additions and 4833888 deletions

@@ -1,31 +1,20 @@
--mailback
--no-tree
--emacs
--summary-file
--show-types
--max-line-length=100
--max-line-length=80
--min-conf-desc-length=1
--typedefsfile=scripts/checkpatch/typedefsfile
--ignore BRACES
--ignore PRINTK_WITHOUT_KERN_LEVEL
--ignore SPLIT_STRING
--ignore VOLATILE
--ignore CONFIG_EXPERIMENTAL
--ignore PREFER_KERNEL_TYPES
--ignore PREFER_SECTION
--ignore AVOID_EXTERNS
--ignore NETWORKING_BLOCK_COMMENT_STYLE
--ignore DATE_TIME
--ignore MINMAX
--ignore CONST_STRUCT
--ignore FILE_PATH_CHANGES
--ignore SPDX_LICENSE_TAG
--ignore C99_COMMENT_TOLERANCE
--ignore REPEATED_WORD
--ignore UNDOCUMENTED_DT_STRING
--ignore DT_SPLIT_BINDING_PATCH
--ignore DT_SCHEMA_BINDING_PATCH
--ignore TRAILING_SEMICOLON
--ignore COMPLEX_MACRO
--ignore MULTISTATEMENT_MACRO_USE_DO_WHILE
--ignore ENOSYS
--ignore IS_ENABLED_CONFIG
--ignore EXPORT_SYMBOL
--exclude ext

@@ -1,114 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
#
# Note: The list of ForEachMacros can be obtained using:
#
# git grep -h '^#define [^[:space:]]*FOR_EACH[^[:space:]]*(' include/ \
# | sed "s,^#define \([^[:space:]]*FOR_EACH[^[:space:]]*\)(.*$, - '\1'," \
# | sort | uniq
#
# References:
# - https://clang.llvm.org/docs/ClangFormatStyleOptions.html
---
BasedOnStyle: LLVM
AlignConsecutiveMacros: AcrossComments
AllowShortBlocksOnASingleLine: Never
AllowShortCaseLabelsOnASingleLine: false
AllowShortEnumsOnASingleLine: false
AllowShortFunctionsOnASingleLine: None
AllowShortIfStatementsOnASingleLine: false
AllowShortLoopsOnASingleLine: false
AttributeMacros:
- __aligned
- __deprecated
- __packed
- __printf_like
- __syscall
- __syscall_always_inline
- __subsystem
BitFieldColonSpacing: After
BreakBeforeBraces: Linux
ColumnLimit: 100
ConstructorInitializerIndentWidth: 8
ContinuationIndentWidth: 8
ForEachMacros:
- 'ARRAY_FOR_EACH'
- 'ARRAY_FOR_EACH_PTR'
- 'FOR_EACH'
- 'FOR_EACH_FIXED_ARG'
- 'FOR_EACH_IDX'
- 'FOR_EACH_IDX_FIXED_ARG'
- 'FOR_EACH_NONEMPTY_TERM'
- 'FOR_EACH_FIXED_ARG_NONEMPTY_TERM'
- 'RB_FOR_EACH'
- 'RB_FOR_EACH_CONTAINER'
- 'SYS_DLIST_FOR_EACH_CONTAINER'
- 'SYS_DLIST_FOR_EACH_CONTAINER_SAFE'
- 'SYS_DLIST_FOR_EACH_NODE'
- 'SYS_DLIST_FOR_EACH_NODE_SAFE'
- 'SYS_SEM_LOCK'
- 'SYS_SFLIST_FOR_EACH_CONTAINER'
- 'SYS_SFLIST_FOR_EACH_CONTAINER_SAFE'
- 'SYS_SFLIST_FOR_EACH_NODE'
- 'SYS_SFLIST_FOR_EACH_NODE_SAFE'
- 'SYS_SLIST_FOR_EACH_CONTAINER'
- 'SYS_SLIST_FOR_EACH_CONTAINER_SAFE'
- 'SYS_SLIST_FOR_EACH_NODE'
- 'SYS_SLIST_FOR_EACH_NODE_SAFE'
- '_WAIT_Q_FOR_EACH'
- 'Z_FOR_EACH'
- 'Z_FOR_EACH_ENGINE'
- 'Z_FOR_EACH_EXEC'
- 'Z_FOR_EACH_FIXED_ARG'
- 'Z_FOR_EACH_FIXED_ARG_EXEC'
- 'Z_FOR_EACH_IDX'
- 'Z_FOR_EACH_IDX_EXEC'
- 'Z_FOR_EACH_IDX_FIXED_ARG'
- 'Z_FOR_EACH_IDX_FIXED_ARG_EXEC'
- 'Z_GENLIST_FOR_EACH_CONTAINER'
- 'Z_GENLIST_FOR_EACH_CONTAINER_SAFE'
- 'Z_GENLIST_FOR_EACH_NODE'
- 'Z_GENLIST_FOR_EACH_NODE_SAFE'
- 'STRUCT_SECTION_FOREACH'
- 'STRUCT_SECTION_FOREACH_ALTERNATE'
- 'TYPE_SECTION_FOREACH'
- 'K_SPINLOCK'
- 'COAP_RESOURCE_FOREACH'
- 'COAP_SERVICE_FOREACH'
- 'COAP_SERVICE_FOREACH_RESOURCE'
- 'HTTP_RESOURCE_FOREACH'
- 'HTTP_SERVER_CONTENT_TYPE_FOREACH'
- 'HTTP_SERVICE_FOREACH'
- 'HTTP_SERVICE_FOREACH_RESOURCE'
- 'I3C_BUS_FOR_EACH_I3CDEV'
- 'I3C_BUS_FOR_EACH_I2CDEV'
IfMacros:
- 'CHECKIF'
# Disabled for now, see bug https://github.com/zephyrproject-rtos/zephyr/issues/48520
#IncludeBlocks: Regroup
IncludeCategories:
  - Regex: '^".*\.h"$'
    Priority: 0
  - Regex: '^<(assert|complex|ctype|errno|fenv|float|inttypes|limits|locale|math|setjmp|signal|stdarg|stdbool|stddef|stdint|stdio|stdlib|string|tgmath|time|wchar|wctype)\.h>$'
    Priority: 1
  - Regex: '^\<zephyr/.*\.h\>$'
    Priority: 2
  - Regex: '.*'
    Priority: 3
IndentCaseLabels: false
IndentGotoLabels: false
IndentWidth: 8
InsertBraces: true
SpaceBeforeInheritanceColon: False
SpaceBeforeParens: ControlStatementsExceptControlMacros
SortIncludes: Never
UseTab: ForContinuationAndIndentation
WhitespaceSensitiveMacros:
- COND_CODE_0
- COND_CODE_1
- IF_DISABLED
- IF_ENABLED
- LISTIFY
- STRINGIFY
- Z_STRINGIFY
- DT_FOREACH_PROP_ELEM_SEP

@@ -1,21 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
#
# Copyright (c) 2024, Basalte bv
analyzer:
# Start by disabling all
- --disable-all
# Enable the sensitive profile
- --enable=sensitive
# Disable unused cases
- --disable=boost
- --disable=mpi
# Many identifiers in zephyr start with _
- --disable=clang-diagnostic-reserved-identifier
- --disable=clang-diagnostic-reserved-macro-identifier
# Cleanup
- --clean

@@ -1,31 +0,0 @@
codecov:
  notify:
    require_ci_to_pass: yes

coverage:
  precision: 2
  round: down
  range: "70...100"
  status:
    project: yes
    patch: yes
    changes: no

# ignore:
#   - "tests/**/*"
#   - "samples/**/*"
#   - "ext/hal/**/*"

parsers:
  gcov:
    branch_detection:
      conditional: yes
      loop: yes
      method: no
      macro: no

comment:
  layout: "reach, diff, flags, files, footer"
  behavior: default
  require_changes: no

@@ -1,92 +0,0 @@
# EditorConfig: https://editorconfig.org/
# top-most EditorConfig file
root = true
# All (Defaults)
[*]
charset = utf-8
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
max_line_length = 100
# Assembly
[*.S]
indent_style = tab
indent_size = 8
# C
[*.{c,h}]
indent_style = tab
indent_size = 8
# C++
[*.{cpp,hpp}]
indent_style = tab
indent_size = 8
# Linker Script
[*.ld]
indent_style = tab
indent_size = 8
# Python
[*.py]
indent_style = space
indent_size = 4
# Perl
[*.pl]
indent_style = tab
indent_size = 8
# reStructuredText
[*.rst]
indent_style = space
indent_size = 3
# YAML
[*.{yml,yaml}]
indent_style = space
indent_size = 2
# Shell Script
[*.sh]
indent_style = space
indent_size = 4
# Windows Command Script
[*.cmd]
end_of_line = crlf
indent_style = tab
indent_size = 8
# Valgrind Suppression File
[*.supp]
indent_style = space
indent_size = 3
# CMake
[{CMakeLists.txt,*.cmake}]
indent_style = space
indent_size = 2
# Makefile
[Makefile]
indent_style = tab
indent_size = 8
# Device tree
[*.{dts,dtsi,overlay}]
indent_style = tab
indent_size = 8
# Git commit messages
[COMMIT_EDITMSG]
max_line_length = 75
# Kconfig
[Kconfig*]
indent_style = tab
indent_size = 8

.gitattributes (8 lines changed)

@@ -3,11 +3,3 @@
.gitattributes export-ignore
.gitignore export-ignore
.mailmap export-ignore
# Tell git to not diff certain files
*.svg -diff
# Tell linguist that generated test pattern files should not be included in the
# language statistics.
*.pat linguist-generated
*.svg linguist-generated

@@ -1,71 +0,0 @@
---
name: Bug report
about: Create a report to help us improve Zephyr
title: ''
labels: bug
assignees: ''
---
<!--
**Notes**
Github Discussions (https://github.com/zephyrproject-rtos/zephyr/discussions)
are available to first verify that the issue is a genuine Zephyr bug and not a
consequence of Zephyr services misuse.
This issue list is only for bugs in the main Zephyr code base
(https://github.com/zephyrproject-rtos/zephyr/). If the bug is for a project
fork (such as NCS) specific feature, please open an issue in the fork project
instead.
-->
**Describe the bug**
<!--
A clear and concise description of what the bug is.
Please also mention any information which could help others to understand
the problem you're facing:
- What target platform are you using?
- What have you tried to diagnose or workaround this issue?
- Is this a regression? If yes, have you been able to "git bisect" it to a
specific commit?
- ...
-->
**To Reproduce**
<!--
Steps to reproduce the behavior:
1. mkdir build; cd build
2. cmake -DBOARD=board\_xyz
3. make
4. See error
-->
**Expected behavior**
<!--
A clear and concise description of what you expected to happen.
-->
**Impact**
<!--
What impact does this issue have on your progress (e.g., annoyance, showstopper)
-->
**Logs and console output**
<!--
If applicable, add console logs or other types of debug information
e.g Wireshark capture or Logic analyzer capture (upload in zip archive).
copy-and-paste text and put a code fence (\`\`\`) before and after, to help
explain the issue. (if unable to obtain text log, add a screenshot)
-->
**Environment (please complete the following information):**
- OS: (e.g. Linux, MacOS, Windows)
- Toolchain (e.g Zephyr SDK, ...)
- Commit SHA or Version used
**Additional context**
<!--
Add any other context that could be relevant to your issue, such as pin setting,
target configuration, ...
-->

@@ -1,28 +0,0 @@
---
name: Enhancement
about: Suggest enhancements to existing features
title: ''
labels: Enhancement
assignees: ''
---
**Is your enhancement proposal related to a problem? Please describe.**
<!--
A clear and concise description of what the problem is.
-->
**Describe the solution you'd like**
<!--
A clear and concise description of what you want to happen.
-->
**Describe alternatives you've considered**
<!--
A clear and concise description of any alternative solutions or features you've considered.
-->
**Additional context**
<!--
Add any other context or graphics (drag-and-drop an image) about the feature request here.
-->

@@ -1,60 +0,0 @@
---
name: RFC / Proposal
about: Submit an RFC / Proposal
title: ''
labels: RFC
assignees: ''
---
## Introduction
<!--
This section targets end users, TSC members, maintainers and anyone else that might
need a quick explanation of your proposed change.
-->
### Problem description
<!--
Why do we want this change and what problem are we trying to address?
-->
### Proposed change
<!--
A brief summary of the proposed change - the 10,000 ft view on what it will
change once this change is implemented.
-->
## Detailed RFC
<!--
In this section of the document the target audience is the dev team. Upon
reading this section each engineer should have a rather clear picture of what
needs to be done in order to implement the described feature.
-->
### Proposed change (Detailed)
<!--
This section is freeform - you should describe your change in as much detail
as possible. Please also ensure to include any context or background info here.
For example, do we have existing components which can be reused or altered.
By reading this section, each team member should be able to know what exactly
you're planning to change and how.
-->
### Dependencies
<!--
Highlight how the change may affect the rest of the project (new components,
modifications in other areas), or other teams/projects.
-->
### Concerns and Unresolved Questions
<!--
List any concerns, unknowns, and generally unresolved questions etc.
-->
## Alternatives
<!--
List any alternatives considered, and the reasons for choosing this option
over them.
-->

@@ -1,28 +0,0 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: Feature Request
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
<!--
A clear and concise description of what the problem is.
-->
**Describe the solution you'd like**
<!--
A clear and concise description of what you want to happen.
-->
**Describe alternatives you've considered**
<!--
A clear and concise description of any alternative solutions or features you've considered.
-->
**Additional context**
<!--
Add any other context or graphics (drag-and-drop an image) about the feature request here.
-->

@@ -1,42 +0,0 @@
---
name: Contributor Nomination
about: Nominate a GitHub user for additional rights on the Zephyr Project
title: ''
labels: Role Nomination
assignees: ''
---
# Background
The [TSC Project Roles] defines the main roles for the Zephyr Project, including
Maintainer, Collaborator, and Contributor.
By default anyone that contributes code or documentation is a Contributor, but
with the lowest [GitHub Permission Level] of Read. For example, Contributors
with Read permission do not have the permission to add reviewers to a pull
request.
Use this template to nominate a GitHub user for the Contributor role with
Triage permission level, which allows the user to add reviewers to a pull
request and be added as a reviewer by other users.
# Nomination
## GitHub User
Provide the following information about the GitHub user:
1. Full Name
1. GitHub username
1. Organization (optional)
## Supporting Documents
Add links to 3-5 GitHub pull requests, in the Zephyr project, authored or
reviewed by the GitHub user that demonstrate the user's dedication to the
Zephyr project.
[TSC Project Roles]: <https://docs.zephyrproject.org/latest/project/project_roles.html>
[GitHub Permission Level]: <https://docs.github.com/en/organizations/managing-access-to-your-organizations-repositories/repository-roles-for-an-organization>

@@ -1,68 +0,0 @@
---
name: External Source Code
about: Submit a proposal to integrate external source code
title: ''
labels: TSC
assignees: ''
---
## Origin
Name of project hosting the original open source code
Provide a link to the source
## Purpose
Brief description of what this software does
## Mode of integration
Describe whether you'd like to integrate this external component in the main tree
or as a module, and why. If the mode of integration is a module, suggest a
repository name for the module
## Maintainership
List the person(s) that will be maintaining the integration of this external code
for the foreseeable future. Please use GitHub IDs to identify them. You can
choose to identify a single maintainer only or add collaborators as well
## Pull Request
Pull request (if any) with the actual implementation of the integration, be it
in the main tree or as a module (pointing to your own fork for now). Make sure
the PR is correctly labeled as "DNM"
## Description
Long description that will help reviewers discuss suitability of the
component to solve the problem at hand (there may be a better options
available.)
What is its primary functionality (e.g., SQLLite is a lightweight
database)?
What problem are you trying to solve? (e.g., a state store is
required to maintain ...)
Why is this the right component to solve it (e.g., SQLite is small,
easy to use, and has a very liberal license.)
## Dependencies
What other components does this package depend on?
Will the Zephyr project have a direct dependency on the component, or
will it be included via an abstraction layer with this component as a
replaceable implementation?
## Revision
Version or SHA you would like to integrate initially
## License
Please use an SPDX identifier (https://spdx.org/licenses/), such as
``BSD-3-Clause``

@@ -1,41 +0,0 @@
---
name: Binary blobs
about: Submit a proposal to integrate binary blob(s)
title: ''
labels: TSC
assignees: ''
---
## Origin
Describe where the binary blob(s) originate from
## Type
- [ ] Precompiled library
- [ ] Firmware image
## Module
The Zephyr module that this blob(s) will be referenced from
## Purpose
Brief description of what the blob(s) do. It is especially important to describe
the functionality that the blob(s) provide, to the largest extent possible
## Pull Request
Link to the Pull request with the actual implementation of the integration. If
you are submitting this as part of a new module you may link to the same Pull
Request that introduced the new module.
## Dependencies
What other components do the blob(s) depend on, if any?
## License
Document the license the blob(s) are distributed under

.github/SECURITY.md (22 lines changed)

@@ -1,22 +0,0 @@
# Security Policy
## Supported versions
The Zephyr project supports the following versions with security
updates:
- The most recent release, and the release prior to that.
- Active LTS releases.
At this time, with the latest release of v4.0, the supported
versions are:
- v4.0: Current release
- v3.7: Prior release and Current LTS
- v2.7: Prior LTS
## Reporting process
Please see our [Security Vulnerability
Reporting](https://docs.zephyrproject.org/latest/security/reporting.html)
page for details on the process.

@@ -1,13 +0,0 @@
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
    commit-message:
      prefix: "ci: github: "
    labels: []
    groups:
      actions-deps:
        patterns:
          - "*"

@@ -1,16 +0,0 @@
license:
main: apache-2.0
report_missing: true
category: Permissive
copyright:
check: true
exclude:
extensions:
- yml
- yaml
- html
- rst
- conf
- cfg
langs:
- HTML

@@ -1,51 +0,0 @@
name: Pull Request Assigner
on:
pull_request_target:
types:
- opened
- synchronize
- reopened
- ready_for_review
branches:
- main
- collab-*
- v*-branch
issues:
types:
- labeled
jobs:
assignment:
name: Pull Request Assignment
if: github.event.pull_request.draft == false
runs-on: ubuntu-22.04
steps:
- name: Install Python dependencies
run: |
pip install -U PyGithub>=1.55 west
- name: Check out source code
uses: actions/checkout@v4
- name: Run assignment script
env:
GITHUB_TOKEN: ${{ secrets.ZB_GITHUB_TOKEN }}
run: |
FLAGS="-v"
FLAGS+=" -o ${{ github.event.repository.owner.login }}"
FLAGS+=" -r ${{ github.event.repository.name }}"
FLAGS+=" -M MAINTAINERS.yml"
if [ "${{ github.event_name }}" = "pull_request_target" ]; then
FLAGS+=" -P ${{ github.event.pull_request.number }}"
elif [ "${{ github.event_name }}" = "issues" ]; then
FLAGS+=" -I ${{ github.event.issue.number }}"
elif [ "${{ github.event_name }}" = "schedule" ]; then
FLAGS+=" --modules"
else
echo "Unknown event: ${{ github.event_name }}"
exit 1
fi
python3 scripts/set_assignees.py $FLAGS

@@ -1,31 +0,0 @@
name: Backport
on:
pull_request_target:
types:
- closed
- labeled
branches:
- main
jobs:
backport:
name: Backport
runs-on: ubuntu-22.04
# Only react to merged PRs for security reasons.
# See https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#pull_request_target.
if: >
github.event.pull_request.merged &&
(
github.event.action == 'closed' ||
(
github.event.action == 'labeled' &&
contains(github.event.label.name, 'backport')
)
)
steps:
- name: Backport
uses: zephyrproject-rtos/action-backport@v2.0.3-3
with:
github_token: ${{ secrets.ZB_GITHUB_TOKEN }}
issue_labels: Backport
labels_template: '["Backport"]'

@@ -1,38 +0,0 @@
name: Backport Issue Check
on:
pull_request_target:
types:
- edited
- opened
- reopened
- synchronize
branches:
- v*-branch
jobs:
backport:
name: Backport Issue Check
concurrency:
group: backport-issue-check-${{ github.ref }}
cancel-in-progress: true
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Check out source code
uses: actions/checkout@v4
- name: Install Python dependencies
run: |
pip install -U pygithub
- name: Run backport issue checker
env:
GITHUB_TOKEN: ${{ secrets.ZB_GITHUB_TOKEN }}
run: |
./scripts/release/list_backports.py \
-o ${{ github.event.repository.owner.login }} \
-r ${{ github.event.repository.name }} \
-b ${{ github.event.pull_request.base.ref }} \
-p ${{ github.event.pull_request.number }}


@@ -1,28 +0,0 @@
name: Publish BabbleSim Tests Results
on:
workflow_run:
workflows: ["BabbleSim Tests"]
types:
- completed
jobs:
bsim-test-results:
name: "Publish BabbleSim Test Results"
runs-on: ubuntu-22.04
if: github.event.workflow_run.conclusion != 'skipped'
steps:
- name: Download artifacts
uses: dawidd6/action-download-artifact@v8
with:
run_id: ${{ github.event.workflow_run.id }}
- name: Publish BabbleSim Test Results
uses: EnricoMi/publish-unit-test-result-action@v2
with:
check_name: BabbleSim Test Results
comment_mode: off
commit: ${{ github.event.workflow_run.head_sha }}
event_file: event/event.json
event_name: ${{ github.event.workflow_run.event }}
files: "bsim-test-results/**/bsim_results.xml"


@@ -1,201 +0,0 @@
name: BabbleSim Tests
on:
pull_request:
paths:
- ".github/workflows/bsim-tests.yaml"
- ".github/workflows/bsim-tests-publish.yaml"
- "west.yml"
- "subsys/bluetooth/**"
- "tests/bsim/**"
- "boards/nordic/nrf5*/*dt*"
- "dts/*/nordic/**"
- "tests/bluetooth/common/testlib/**"
- "samples/bluetooth/**"
- "boards/native/**"
- "soc/native/**"
- "arch/posix/**"
- "include/zephyr/arch/posix/**"
- "scripts/native_simulator/**"
- "samples/net/sockets/echo_*/**"
- "modules/hal_nordic/**"
- "modules/mbedtls/**"
- "modules/openthread/**"
- "subsys/net/l2/openthread/**"
- "include/zephyr/net/openthread.h"
- "drivers/ieee802154/**"
- "include/zephyr/net/ieee802154*"
- "drivers/serial/*nrfx*"
- "tests/drivers/uart/**"
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
bsim-test:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.27.4.20241026
options: '--entrypoint /bin/bash'
env:
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
EDTT_PATH: ../tools/edtt
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Environment Setup
env:
BASE_REF: ${{ github.base_ref }}
run: |
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
west init -l . || true
west config manifest.group-filter -- +ci
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Check common triggering files
uses: tj-actions/changed-files@v45
id: check-common-files
with:
files: |
.github/workflows/bsim-tests.yaml
.github/workflows/bsim-tests-publish.yaml
west.yml
boards/native/
soc/native/
arch/posix/
include/zephyr/arch/posix/
scripts/native_simulator/
tests/bsim/*
boards/nordic/nrf5*/*dt*
dts/*/nordic/
modules/mbedtls/**
modules/hal_nordic/**
- name: Check if Bluetooth files changed
uses: tj-actions/changed-files@v45
id: check-bluetooth-files
with:
files: |
tests/bsim/bluetooth/
samples/bluetooth/
subsys/bluetooth/
- name: Check if Networking files changed
uses: tj-actions/changed-files@v45
id: check-networking-files
with:
files: |
tests/bsim/net/
samples/net/sockets/echo_*/
modules/openthread/
subsys/net/l2/openthread/
include/zephyr/net/openthread.h
drivers/ieee802154/
include/zephyr/net/ieee802154*
- name: Check if UART files changed
uses: tj-actions/changed-files@v45
id: check-uart-files
with:
files: |
tests/bsim/drivers/uart/
drivers/serial/*nrfx*
tests/drivers/uart/
- name: Update BabbleSim to manifest revision
if: >
steps.check-bluetooth-files.outputs.any_modified == 'true'
|| steps.check-networking-files.outputs.any_modified == 'true'
|| steps.check-uart-files.outputs.any_modified == 'true'
|| steps.check-common-files.outputs.any_modified == 'true'
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git -c advice.detachedHead=false checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- name: Run Bluetooth Tests with BSIM
if: steps.check-bluetooth-files.outputs.any_modified == 'true' || steps.check-common-files.outputs.any_modified == 'true'
run: |
tests/bsim/ci.bt.sh
- name: Run Networking Tests with BSIM
if: steps.check-networking-files.outputs.any_modified == 'true' || steps.check-common-files.outputs.any_modified == 'true'
run: |
tests/bsim/ci.net.sh
- name: Run UART Tests with BSIM
if: steps.check-uart-files.outputs.any_modified == 'true' || steps.check-common-files.outputs.any_modified == 'true'
run: |
tests/bsim/ci.uart.sh
- name: Merge Test Results
run: |
pip install junitparser junit2html
junitparser merge --glob "./bsim_*/*bsim_results.*.xml" "./twister-out/twister.xml" junit.xml
junit2html junit.xml junit.html
- name: Upload Unit Test Results in HTML
if: always()
uses: actions/upload-artifact@v4
with:
name: HTML Unit Test Results
if-no-files-found: ignore
path: |
junit.html
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@v2
with:
check_name: Bsim Test Results
files: "junit.xml"
comment_mode: off
- name: Upload Event Details
if: always()
uses: actions/upload-artifact@v4
with:
name: event
path: |
${{ github.event_path }}


@@ -1,66 +0,0 @@
# Copyright (c) 2021, 2022 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
# Make a snapshot of open bugs as a python pickle file, compressed
# using xz. Upload the xz file to Amazon S3.
name: Bug Snapshot
on:
workflow_dispatch:
branches: [main]
schedule:
# Run daily at 14:05
- cron: '5 14 * * *'
jobs:
make_bugs_pickle:
name: Make bugs pickle
runs-on: ubuntu-22.04
if: github.repository_owner == 'zephyrproject-rtos'
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Install Python dependencies
run: |
pip install -U pygithub
- name: Snapshot bugs
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
BUGS_PICKLE_FILENAME="zephyr-bugs-$(date -I).pickle.xz"
BUGS_PICKLE_PATH="${RUNNER_TEMP}/${BUGS_PICKLE_FILENAME}"
set -euo pipefail
python3 scripts/make_bugs_pickle.py | xz > ${BUGS_PICKLE_PATH}
echo "BUGS_PICKLE_FILENAME=${BUGS_PICKLE_FILENAME}" >> ${GITHUB_ENV}
echo "BUGS_PICKLE_PATH=${BUGS_PICKLE_PATH}" >> ${GITHUB_ENV}
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ vars.AWS_BUILDS_ZEPHYR_BUG_SNAPSHOT_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_BUILDS_ZEPHYR_BUG_SNAPSHOT_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
run: |
REPOSITORY_NAME="${GITHUB_REPOSITORY#*/}"
PUBLISH_BUCKET="builds.zephyrproject.org"
PUBLISH_DOMAIN="builds.zephyrproject.io"
if [ "${{github.event_name}}" = "schedule" ]; then
PUBLISH_ROOT="${REPOSITORY_NAME}/bug-snapshot/daily"
else
PUBLISH_ROOT="${REPOSITORY_NAME}/bug-snapshot"
fi
PUBLISH_UPLOAD_URI="s3://${PUBLISH_BUCKET}/${PUBLISH_ROOT}/${BUGS_PICKLE_FILENAME}"
PUBLISH_DOWNLOAD_URI="https://${PUBLISH_DOMAIN}/${PUBLISH_ROOT}/${BUGS_PICKLE_FILENAME}"
aws s3 cp --quiet ${BUGS_PICKLE_PATH} ${PUBLISH_UPLOAD_URI}
echo "Bug pickle is available at: ${PUBLISH_DOWNLOAD_URI}" >> ${GITHUB_STEP_SUMMARY}


@@ -1,229 +0,0 @@
name: Code Coverage with codecov
on:
schedule:
- cron: '25 06,18 * * *'
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
codecov:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.27.4.20241026
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
platform: ["mps2/an385", "native_sim", "qemu_x86", "unit_testing"]
include:
- platform: 'mps2/an385'
normalized: 'mps2_an385'
- platform: 'native_sim'
normalized: 'native_sim'
- platform: 'qemu_x86'
normalized: 'qemu_x86'
- platform: 'unit_testing'
normalized: 'unit_testing'
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
# `--specs` is ignored because ccache is unable to resolve the toolchain specs file path.
CCACHE_IGNOREOPTIONS: '-specs=* --specs=*'
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: west setup
run: |
west init -l . || true
west update 1> west.update.log || west update 1> west.update-2.log
- name: Environment Setup
run: |
cmake --version
gcc --version
ls -la
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- name: Update BabbleSim to manifest revision
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git -c advice.detachedHead=false checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- name: Run Tests with Twister (Push)
continue-on-error: true
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
mkdir -p coverage/reports
pip install gcovr==6.0
./scripts/twister -E ${{matrix.normalized}}-testplan.json
ls -la
./scripts/twister \
-i --force-color -N -v --filter runnable -p ${{ matrix.platform }} --coverage \
-T tests --coverage-tool gcovr -xCONFIG_TEST_EXTRA_STACK_SIZE=4096 -e nano \
--timeout-multiplier 2
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Rename coverage files
if: always()
run: |
mv twister-out/coverage.json coverage/reports/${{matrix.normalized}}.json
- name: Upload Coverage Results
if: always()
uses: actions/upload-artifact@v4
with:
name: Coverage Data (Subset ${{ matrix.normalized }})
path: |
coverage/reports/${{ matrix.normalized }}.json
${{ matrix.normalized }}-testplan.json
codecov-results:
name: "Publish Coverage Results"
needs: codecov
runs-on: ubuntu-22.04
# The codecov job might be skipped; in that case this job does not need to run.
if: success() || failure()
steps:
- name: checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
path: coverage/reports
- name: Move coverage files
run: |
ls -lRt ./coverage/reports
mv ./coverage/reports/*/*testplan.json .
mv ./coverage/reports/*/coverage/reports/*.json ./coverage/reports
ls -la ./coverage/reports
- name: Generate list of coverage files
id: get-coverage-files
shell: cmake -P {0}
run: |
file(GLOB INPUT_FILES_LIST "coverage/reports/*.json")
set(MERGELIST "")
set(FILELIST "")
foreach(ITEM ${INPUT_FILES_LIST})
get_filename_component(f ${ITEM} NAME)
if(FILELIST STREQUAL "")
set(FILELIST "${f}")
else()
set(FILELIST "${FILELIST},${f}")
endif()
endforeach()
foreach(ITEM ${INPUT_FILES_LIST})
get_filename_component(f ${ITEM} NAME)
if(MERGELIST STREQUAL "")
set(MERGELIST "--add-tracefile ${f}")
else()
set(MERGELIST "${MERGELIST} -a ${f}")
endif()
endforeach()
file(APPEND $ENV{GITHUB_OUTPUT} "mergefiles=${MERGELIST}\n")
file(APPEND $ENV{GITHUB_OUTPUT} "covfiles=${FILELIST}\n")
- name: Merge coverage files
run: |
pushd ./coverage/reports
pip install gcovr==6.0
gcovr ${{ steps.get-coverage-files.outputs.mergefiles }} --merge-mode-functions=separate --json merged.json
gcovr ${{ steps.get-coverage-files.outputs.mergefiles }} --merge-mode-functions=separate --cobertura merged.xml
popd
- name: Get current date
id: run_date
run: |
echo "run_date=$(date --iso-8601=minutes)" >> "$GITHUB_OUTPUT"
echo "run_date_short=$(date +'%Y-%m-%d')" >> "$GITHUB_OUTPUT"
echo "run_date_year=$(date +'%Y')" >> "$GITHUB_OUTPUT"
echo "run_date_month=$(date +'%m')" >> "$GITHUB_OUTPUT"
- name: Generate Coverage Report
if: always()
run: |
pip install xlsxwriter ijson
python3 ./scripts/ci/coverage/coverage_analysis.py \
-t native_sim-testplan.json \
-m MAINTAINERS.yml \
-c coverage/reports/merged.json \
-o coverage-report-${{ steps.run_date.outputs.run_date_short }} \
-f all
cp coverage-report-* coverage/reports/
- name: Upload Merged Coverage Results and Report
if: always()
uses: actions/upload-artifact@v4
with:
name: Coverage Data and report
path: |
coverage/reports/merged.json
coverage/reports/merged.xml
coverage/reports/coverage-report-${{ steps.run_date.outputs.run_date_short }}.json
coverage/reports/coverage-report-${{ steps.run_date.outputs.run_date_short }}.xlsx
- name: Upload coverage to Codecov
if: always()
uses: codecov/codecov-action@v5
with:
env_vars: OS,PYTHON
fail_ci_if_error: false
verbose: true
token: ${{ secrets.CODECOV_TOKEN }}
files: coverage/reports/merged.xml
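For illustration, if only the native_sim and qemu_x86 subsets produced reports, the "Generate list of coverage files" step above would emit mergefiles=--add-tracefile native_sim.json -a qemu_x86.json and covfiles=native_sim.json,qemu_x86.json, and the "Merge coverage files" step would then pass those arguments straight to gcovr from inside coverage/reports.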


@@ -1,61 +0,0 @@
name: Coding Guidelines
on: pull_request
jobs:
compliance_job:
runs-on: ubuntu-22.04
name: Run coding guidelines checks on patch series (PR)
steps:
- name: Checkout the code
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: cache-pip
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('.github/workflows/coding_guidelines.yml') }}
- name: Install python dependencies
run: |
pip install unidiff
pip install sh
- name: Install Packages
run: |
sudo apt-get update
sudo apt-get install coccinelle
- name: Run Coding Guidelines Checks
continue-on-error: true
id: coding_guidelines
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=$PWD
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
git remote -v
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
source zephyr-env.sh
# debug
ls -la
git log --pretty=oneline | head -n 10
./scripts/ci/guideline_check.py --output output.txt -c origin/${BASE_REF}..
- name: check-warns
run: |
if [[ -s "output.txt" ]]; then
errors=$(cat output.txt)
errors="${errors//'%'/'%25'}"
errors="${errors//$'\n'/'%0A'}"
errors="${errors//$'\r'/'%0D'}"
echo "::error file=output.txt::$errors"
exit 1;
fi


@@ -1,130 +0,0 @@
name: Compliance Checks
on:
pull_request:
types:
- edited
- opened
- reopened
- synchronize
jobs:
check_compliance:
runs-on: ubuntu-22.04
name: Run compliance checks on patch series (PR)
steps:
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Checkout the code
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Rebase onto the target branch
env:
BASE_REF: ${{ github.base_ref }}
run: |
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
git remote -v
# Ensure there are no merge commits in the PR
[[ "$(git rev-list --merges --count origin/${BASE_REF}..)" == "0" ]] || \
(echo "::error ::Merge commits not allowed, rebase instead";false)
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
# debug
git log --pretty=oneline | head -n 10
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: 3.11
- name: cache-pip
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('.github/workflows/compliance.yml') }}
- name: Install python dependencies
run: |
pip install -r scripts/requirements-compliance.txt
pip install west
- name: west setup
run: |
west init -l . || true
west config manifest.group-filter -- +ci,-optional
west update -o=--depth=1 -n 2>&1 1> west.update.log || west update -o=--depth=1 -n 2>&1 1> west.update2.log
- name: Check for PR description
if: ${{ github.event.pull_request.body == '' }}
continue-on-error: true
id: pr_description
run: |
echo "Pull request description cannot be empty."
exit 1
- name: Run Compliance Tests
continue-on-error: true
id: compliance
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=$PWD
# debug
ls -la
git log --pretty=oneline | head -n 10
# Increase rename limit to allow for large PRs
git config diff.renameLimit 10000
./scripts/ci/check_compliance.py --annotate -e KconfigBasic -e SysbuildKconfigBasic -e ClangFormat \
-c origin/${BASE_REF}..
- name: upload-results
uses: actions/upload-artifact@v4
continue-on-error: true
with:
name: compliance.xml
path: compliance.xml
- name: check-warns
run: |
if [[ ! -s "compliance.xml" ]]; then
exit 1;
fi
warns=("ClangFormat")
files=($(./scripts/ci/check_compliance.py -l))
for file in "${files[@]}"; do
f="${file}.txt"
if [[ -s $f ]]; then
results=$(cat $f)
results="${results//'%'/'%25'}"
results="${results//$'\n'/'%0A'}"
results="${results//$'\r'/'%0D'}"
if [[ "${warns[@]}" =~ "${file}" ]]; then
echo "::warning file=${f}::$results"
else
echo "::error file=${f}::$results"
exit=1
fi
fi
done
if [ "${exit}" == "1" ]; then
echo "Compliance error, check for error messages in the \"Run Compliance Tests\" step"
echo "You can run this step locally with the ./scripts/ci/check_compliance.py script."
exit 1;
fi
if [ "${{ steps.pr_description.outcome }}" == "failure" ]; then
echo "PR description cannot be empty"
exit 1;
fi


@@ -1,38 +0,0 @@
# Copyright (c) 2020 Intel Corp.
# SPDX-License-Identifier: Apache-2.0
name: Publish commit for daily testing
on:
schedule:
- cron: '50 22 * * *'
push:
branches:
- refs/tags/*
jobs:
get_version:
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: install-pip
run: |
pip install gitpython
- name: checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Upload to AWS S3
run: |
python3 scripts/ci/version_mgr.py --update .
aws s3 cp versions.json s3://testing.zephyrproject.org/daily_tests/versions.json


@@ -1,69 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2020 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Devicetree script tests
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/dts/**'
- '.github/workflows/devicetree_checks.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/dts/**'
- '.github/workflows/devicetree_checks.yml'
jobs:
devicetree-checks:
name: Devicetree script tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-22.04, macos-14, windows-2022]
steps:
- name: checkout
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: cache-pip-mac
if: startsWith(runner.os, 'macOS')
uses: actions/cache@v4
with:
path: ~/Library/Caches/pip
# Trailing '-' was just to get a different cache name
key: ${{ runner.os }}-pip-${{ matrix.python-version }}-
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}-
- name: cache-pip-win
if: startsWith(runner.os, 'Windows')
uses: actions/cache@v4
with:
path: ~\AppData\Local\pip\Cache
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install python dependencies
run: |
pip install pytest pyyaml tox
- name: run tox
working-directory: scripts/dts/python-devicetree
run: |
tox

View File

@@ -1,21 +0,0 @@
name: Do Not Merge
on:
pull_request:
types: [synchronize, opened, reopened, labeled, unlabeled]
jobs:
do-not-merge:
name: Prevent Merging
runs-on: ubuntu-22.04
steps:
- name: Check for label
if: ${{ contains(github.event.*.labels.*.name, 'DNM') ||
contains(github.event.*.labels.*.name, 'DNM (manifest)') ||
contains(github.event.*.labels.*.name, 'TSC') ||
contains(github.event.*.labels.*.name, 'Architecture Review') ||
contains(github.event.*.labels.*.name, 'dev-review') }}
run: |
echo "Pull request is labeled as 'DNM', 'TSC', 'Architecture Review' or 'dev-review'."
echo "This workflow fails so that the pull request cannot be merged."
exit 1


@@ -1,264 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# SPDX-License-Identifier: Apache-2.0
name: Documentation Build
on:
schedule:
- cron: '0 */3 * * *'
push:
tags:
- v*
pull_request:
env:
# NOTE: west docstrings will be extracted from the version listed here
WEST_VERSION: 1.2.0
# The latest CMake available directly with apt is 3.18, but we need >=3.20
# so we fetch that through pip.
CMAKE_VERSION: 3.20.5
DOXYGEN_VERSION: 1.12.0
JOB_COUNT: 4
jobs:
doc-file-check:
name: Check for doc changes
runs-on: ubuntu-24.04
if: >
github.repository_owner == 'zephyrproject-rtos'
outputs:
file_check: ${{ steps.check-doc-files.outputs.any_modified }}
steps:
- name: checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Check if Documentation related files changed
uses: tj-actions/changed-files@v45
id: check-doc-files
with:
files: |
doc/
boards/**/doc/
**.rst
include/
kernel/include/kernel_arch_interface.h
lib/libc/**
subsys/testsuite/ztest/include/**
tests/
**/Kconfig*
west.yml
scripts/dts/
doc/requirements.txt
.github/workflows/doc-build.yml
scripts/pylib/pytest-twister-harness/src/twister_harness/device/device_adapter.py
scripts/pylib/pytest-twister-harness/src/twister_harness/helpers/shell.py
doc-build-html:
name: "Documentation Build (HTML)"
needs: [doc-file-check]
if: >
github.repository_owner == 'zephyrproject-rtos' &&
( needs.doc-file-check.outputs.file_check == 'true' || github.event_name != 'pull_request' )
runs-on: ubuntu-24.04
timeout-minutes: 90
concurrency:
group: doc-build-html-${{ github.ref }}
cancel-in-progress: true
steps:
- name: install-pkgs
run: |
sudo apt-get update
sudo apt-get install -y wget python3-pip git ninja-build graphviz lcov
wget --no-verbose "https://github.com/doxygen/doxygen/releases/download/Release_${DOXYGEN_VERSION//./_}/doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz"
sudo tar xf doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz -C /opt
echo "/opt/doxygen-${DOXYGEN_VERSION}/bin" >> $GITHUB_PATH
echo "${HOME}/.local/bin" >> $GITHUB_PATH
- name: checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
path: zephyr
- name: Rebase
if: github.event_name == 'pull_request'
continue-on-error: true
env:
BASE_REF: ${{ github.base_ref }}
PR_HEAD: ${{ github.event.pull_request.head.sha }}
working-directory: zephyr
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --graph --oneline HEAD...${PR_HEAD}
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@v1
with:
app-path: zephyr
toolchains: 'all'
- name: install-pip
working-directory: zephyr
run: |
pip install -r doc/requirements.txt
pip install coverxygen
- name: build-docs
shell: bash
working-directory: zephyr
run: |
if [[ "$GITHUB_REF" =~ "refs/tags/v" ]]; then
DOC_TAG="release"
else
DOC_TAG="development"
fi
if [[ "${{ github.event_name }}" == "pull_request" ]]; then
DOC_TARGET="html-fast"
else
DOC_TARGET="html"
fi
DOC_TAG=${DOC_TAG} \
SPHINXOPTS="-j ${JOB_COUNT} -W --keep-going -T" \
SPHINXOPTS_EXTRA="-q -t publish" \
make -C doc ${DOC_TARGET}
# API documentation coverage
python3 -m coverxygen --xml-dir doc/_build/html/doxygen/xml/ --src-dir include/ --output doc-coverage.info
# deprecated page causing issues
lcov --remove doc-coverage.info \*/deprecated > new.info
genhtml --no-function-coverage --no-branch-coverage new.info -o coverage-report
- name: compress-docs
working-directory: zephyr
run: |
tar --use-compress-program="xz -T0" -cf html-output.tar.xz --exclude html/_sources --exclude html/doxygen/xml --directory=doc/_build html
tar --use-compress-program="xz -T0" -cf api-output.tar.xz --directory=doc/_build html/doxygen/html
tar --use-compress-program="xz -T0" -cf api-coverage.tar.xz coverage-report
- name: upload-build
uses: actions/upload-artifact@v4
with:
name: html-output
path: zephyr/html-output.tar.xz
- name: upload-api-coverage
uses: actions/upload-artifact@v4
with:
name: api-coverage
path: zephyr/api-coverage.tar.xz
- name: process-pr
if: github.event_name == 'pull_request'
run: |
REPO_NAME="${{ github.event.repository.name }}"
PR_NUM="${{ github.event.pull_request.number }}"
DOC_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/docs/"
API_DOC_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/docs/doxygen/html/"
API_COVERAGE_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/api-coverage/"
echo "${PR_NUM}" > pr_num
echo "Documentation will be available shortly at: ${DOC_URL}" >> $GITHUB_STEP_SUMMARY
echo "API Documentation will be available shortly at: ${API_DOC_URL}" >> $GITHUB_STEP_SUMMARY
echo "API Coverage Report will be available shortly at: ${API_COVERAGE_URL}" >> $GITHUB_STEP_SUMMARY
- name: upload-pr-number
uses: actions/upload-artifact@v4
if: github.event_name == 'pull_request'
with:
name: pr_num
path: pr_num
doc-build-pdf:
name: "Documentation Build (PDF)"
needs: [doc-file-check]
if: |
github.event_name != 'pull_request' &&
github.repository_owner == 'zephyrproject-rtos'
runs-on: ubuntu-22.04
container: texlive/texlive:latest
timeout-minutes: 120
concurrency:
group: doc-build-pdf-${{ github.ref }}
cancel-in-progress: true
steps:
- name: Apply container owner mismatch workaround
run: |
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: checkout
uses: actions/checkout@v4
- name: install-pkgs
run: |
apt-get update
apt-get install -y python3-pip python3-venv ninja-build doxygen graphviz librsvg2-bin imagemagick
- name: cache-pip
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: pip-${{ hashFiles('doc/requirements.txt') }}
- name: setup-venv
run: |
python3 -m venv .venv
. .venv/bin/activate
echo PATH=$PATH >> $GITHUB_ENV
- name: install-pip
run: |
pip install -r doc/requirements.txt
pip install west==${WEST_VERSION}
pip install cmake==${CMAKE_VERSION}
- name: west setup
run: |
west init -l .
- name: build-docs
shell: bash
continue-on-error: true
run: |
if [[ "$GITHUB_REF" =~ "refs/tags/v" ]]; then
DOC_TAG="release"
else
DOC_TAG="development"
fi
DOC_TAG=${DOC_TAG} \
SPHINXOPTS="-q -j ${JOB_COUNT}" \
LATEXMKOPTS="-quiet -halt-on-error" \
make -C doc pdf
- name: upload-build
if: always()
uses: actions/upload-artifact@v4
with:
name: pdf-output
if-no-files-found: ignore
path: |
doc/_build/latex/zephyr.pdf
doc/_build/latex/zephyr.log
doc-build-status-check:
if: always()
name: "Documentation Build Status"
needs:
- doc-build-pdf
- doc-file-check
- doc-build-html
uses: ./.github/workflows/ready-to-merge.yml
with:
needs_context: ${{ toJson(needs) }}


@@ -1,84 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2021 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Documentation Publish (Pull Request)
on:
workflow_run:
workflows: ["Documentation Build"]
types:
- completed
jobs:
doc-publish:
name: Publish Documentation
runs-on: ubuntu-22.04
if: |
github.event.workflow_run.event == 'pull_request' &&
github.event.workflow_run.conclusion == 'success' &&
github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download artifacts
id: download-artifacts
uses: dawidd6/action-download-artifact@v8
with:
workflow: doc-build.yml
run_id: ${{ github.event.workflow_run.id }}
if_no_artifact_found: ignore
- name: Load PR number
if: steps.download-artifacts.outputs.found_artifact == 'true'
uses: actions/github-script@v7
with:
script: |
let fs = require("fs");
let pr_number = Number(fs.readFileSync("./pr_num/pr_num"));
core.exportVariable("PR_NUM", pr_number);
- name: Check PR number
if: steps.download-artifacts.outputs.found_artifact == 'true'
id: check-pr
uses: carpentries/actions/check-valid-pr@v0.14.0
with:
pr: ${{ env.PR_NUM }}
sha: ${{ github.event.workflow_run.head_sha }}
- name: Validate PR number
if: |
steps.download-artifacts.outputs.found_artifact == 'true' &&
steps.check-pr.outputs.VALID != 'true'
run: |
echo "ABORT: PR number validation failed!"
exit 1
- name: Uncompress HTML docs
if: steps.download-artifacts.outputs.found_artifact == 'true'
run: |
tar xf html-output/html-output.tar.xz -C html-output
if [ -f api-coverage/api-coverage.tar.xz ]; then
tar xf api-coverage/api-coverage.tar.xz -C api-coverage
fi
- name: Configure AWS Credentials
if: steps.download-artifacts.outputs.found_artifact == 'true'
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ vars.AWS_BUILDS_ZEPHYR_PR_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_BUILDS_ZEPHYR_PR_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
if: steps.download-artifacts.outputs.found_artifact == 'true'
env:
HEAD_BRANCH: ${{ github.event.workflow_run.head_branch }}
run: |
aws s3 sync --quiet html-output/html \
s3://builds.zephyrproject.org/${{ github.event.repository.name }}/pr/${PR_NUM}/docs \
--delete
if [ -d api-coverage/coverage-report ]; then
aws s3 sync --quiet api-coverage/coverage-report/ \
s3://builds.zephyrproject.org/${{ github.event.repository.name }}/pr/${PR_NUM}/api-coverage \
--delete
fi


@@ -1,61 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2021 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Documentation Publish
on:
workflow_run:
workflows: ["Documentation Build"]
branches:
- main
- v*
types:
- completed
jobs:
doc-publish:
name: Publish Documentation
runs-on: ubuntu-22.04
if: |
github.event.workflow_run.event != 'pull_request' &&
github.event.workflow_run.conclusion == 'success' &&
github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download artifacts
uses: dawidd6/action-download-artifact@v8
with:
workflow: doc-build.yml
run_id: ${{ github.event.workflow_run.id }}
- name: Uncompress HTML docs
run: |
tar xf html-output/html-output.tar.xz -C html-output
if [ -f api-coverage/api-coverage.tar.xz ]; then
tar xf api-coverage/api-coverage.tar.xz -C api-coverage
fi
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ vars.AWS_DOCS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_DOCS_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
env:
HEAD_BRANCH: ${{ github.event.workflow_run.head_branch }}
run: |
if [ "${HEAD_BRANCH:0:1}" == "v" ]; then
VERSION=${HEAD_BRANCH:1}
else
VERSION="latest"
fi
aws s3 sync --quiet html-output/html s3://docs.zephyrproject.org/${VERSION} --delete
aws s3 sync --quiet html-output/html/doxygen/html s3://docs.zephyrproject.org/apidoc/${VERSION} --delete
if [ -d api-coverage/coverage-report ]; then
aws s3 sync --quiet api-coverage/coverage-report/ s3://docs.zephyrproject.org/api-coverage/${VERSION} --delete
fi
aws s3 cp --quiet pdf-output/zephyr.pdf s3://docs.zephyrproject.org/${VERSION}/zephyr.pdf


@@ -1,34 +0,0 @@
name: Error numbers
on:
pull_request:
paths:
- '.github/workflows/errno.yml'
- 'lib/libc/minimal/include/errno.h'
- 'scripts/ci/errno.py'
jobs:
check-errno:
runs-on: ubuntu-22.04
container:
image: ghcr.io/zephyrproject-rtos/ci:v0.27.4
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: checkout
uses: actions/checkout@v4
- name: Environment Setup
run: |
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Run errno.py
run: |
export ZEPHYR_BASE=${PWD}
./scripts/ci/errno.py


@@ -1,128 +0,0 @@
name: Footprint Tracking
# Run every 12 hours and on tags
on:
schedule:
- cron: '50 1/12 * * *'
push:
paths:
- 'VERSION'
- '.github/workflows/footprint-tracking.yml'
branches:
- main
- v*-branch
tags:
# only publish v* tags, do not care about zephyr-v* which point to the
# same commit
- 'v*'
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
footprint-tracking:
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
if: github.repository_owner == 'zephyrproject-rtos'
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.27.4.20241026
options: '--entrypoint /bin/bash'
defaults:
run:
shell: bash
strategy:
fail-fast: false
env:
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Install packages
run: |
sudo apt-get update
sudo apt-get install -y python3-venv
pip install -U gitpython
- name: checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Environment Setup
run: |
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: west setup
run: |
west init -l . || true
west config --global update.narrow true
west update 2>&1 1> west.update.log || west update 2>&1 1> west.update2.log
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Record Footprint
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=${PWD}
./scripts/footprint/track.py -p scripts/footprint/plan.txt
- name: Upload footprint data
run: |
python3 -m venv .venv
. .venv/bin/activate
pip install awscli
aws s3 sync --quiet footprint_data/ s3://testing.zephyrproject.org/footprint_data/
- name: Transform Footprint data to Twister JSON reports
run: |
shopt -s globstar
export ZEPHYR_BASE=${PWD}
python3 ./scripts/footprint/pack_as_twister.py -vvv \
--plan ./scripts/footprint/plan.txt \
--test-name='name.feature' \
./footprint_data/**/
- name: Upload to ElasticSearch
env:
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
ELASTICSEARCH_INDEX: ${{ vars.FOOTPRINT_TRACKING_INDEX }}
run: |
shopt -s globstar
pip install -U elasticsearch
run_date=`date --iso-8601=minutes`
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--flatten footprint \
--flatten-list-names "{'children':'name'}" \
--transform "{ 'footprint_name': '^(?P<footprint_area>([^\/]+\/){0,2})(?P<footprint_path>([^\/]*\/)*)(?P<footprint_symbol>[^\/]*)$' }" \
--run-id "${{ github.run_id }}" \
--run-attempt "${{ github.run_attempt }}" \
--run-workflow "footprint-tracking:${{ github.event_name }}" \
--run-branch "${{ github.ref_name }}" \
-i ${ELASTICSEARCH_INDEX} \
./footprint_data/**/twister_footprint.json
#


@@ -1,58 +0,0 @@
name: Greet first time contributor
on:
issues:
types: [opened]
pull_request_target:
types: [opened, closed]
jobs:
check_for_first_interaction:
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- uses: actions/checkout@v4
- uses: zephyrproject-rtos/action-first-interaction@v1.1.1-zephyr-5
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
issue-message: >
Hi @${{github.event.issue.user.login}}! We appreciate you submitting your first issue
for our open-source project. 🌟
Even though I'm a bot, I can assure you that the whole community is genuinely grateful
for your time and effort. 🤖💙
pr-opened-message: >
Hello @${{ github.event.pull_request.user.login }}, and thank you very much for your
first pull request to the Zephyr project!
Our Continuous Integration pipeline will execute a series of checks on your Pull Request
commit messages and code, and you are expected to address any failures by updating the PR.
Please take a look at [our commit message guidelines](https://docs.zephyrproject.org/latest/contribute/guidelines.html#commit-message-guidelines)
to find out how to format your commit messages, and at [our contribution workflow](https://docs.zephyrproject.org/latest/contribute/guidelines.html#contribution-workflow)
to understand how to update your Pull Request.
If you haven't already, please make sure to review the project's [Contributor
Expectations](https://docs.zephyrproject.org/latest/contribute/contributor_expectations.html)
and update (by amending and force-pushing the commits) your pull request if necessary.
If you are stuck or need help please join us on [Discord](https://chat.zephyrproject.org/)
and ask your question there. Additionally, you can [escalate the review](https://docs.zephyrproject.org/latest/contribute/contributor_expectations.html#pr-review-escalation)
when applicable. 😊
pr-merged-message: >
Hi @${{ github.event.pull_request.user.login }}!
Congratulations on getting your very first Zephyr pull request merged 🎉🥳. This is a
fantastic achievement, and we're thrilled to have you as part of our community!
To celebrate this milestone and showcase your contribution, we'd love to award you the
Zephyr Technical Contributor badge. If you're interested, please claim your badge by
filling out this form: [Claim Your Zephyr Badge](https://forms.gle/oCw9iAPLhUsHTapc8).
Thank you for your valuable input, and we look forward to seeing more of your
contributions in the future! 🪁


@@ -1,82 +0,0 @@
name: Hello World (Multiplatform)
on:
push:
branches:
- main
- v*-branch
- collab-*
pull_request:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/**'
- '.github/workflows/hello_world_multiplatform.yaml'
- 'SDK_VERSION'
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
build:
strategy:
fail-fast: false
matrix:
os: [ubuntu-22.04, ubuntu-24.04, macos-13, macos-14, windows-2022]
runs-on: ${{ matrix.os }}
steps:
- name: Checkout
uses: actions/checkout@v4
with:
path: zephyr
fetch-depth: 0
- name: Rebase
if: github.event_name == 'pull_request'
env:
BASE_REF: ${{ github.base_ref }}
PR_HEAD: ${{ github.event.pull_request.head.sha }}
working-directory: zephyr
shell: bash
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --graph --oneline HEAD...${PR_HEAD}
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: 3.11
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@v1
with:
app-path: zephyr
toolchains: all
- name: Build firmware
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O/tmp/twister-out"
fi
./scripts/twister --runtime-artifact-cleanup --force-color --inline-logs -T samples/hello_world -T samples/cpp/hello_world -v $EXTRA_TWISTER_FLAGS
- name: Upload artifacts
if: failure()
uses: actions/upload-artifact@v4
with:
if-no-files-found: ignore
path:
zephyr/twister-out/*/samples/hello_world/sample.basic.helloworld/build.log
zephyr/twister-out/*/samples/cpp/hello_world/sample.cpp.helloworld/build.log


@@ -1,54 +0,0 @@
name: Issue Tracker
on:
schedule:
- cron: '*/10 * * * *'
env:
OUTPUT_FILE_NAME: IssuesReport.md
COMMITTER_EMAIL: actions@github.com
COMMITTER_NAME: github-actions
COMMITTER_USERNAME: github-actions
jobs:
track-issues:
name: "Collect Issue Stats"
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download configuration file
run: |
wget -q https://raw.githubusercontent.com/$GITHUB_REPOSITORY/main/.github/workflows/issues-report-config.json
- name: install-packages
run: |
sudo apt-get update
sudo apt-get install discount
- uses: brcrista/summarize-issues@v4
with:
title: 'Issues Report for ${{ github.repository }}'
configPath: 'issues-report-config.json'
outputPath: ${{ env.OUTPUT_FILE_NAME }}
token: ${{ secrets.GITHUB_TOKEN }}
- name: upload-stats
uses: actions/upload-artifact@v4
continue-on-error: true
with:
name: ${{ env.OUTPUT_FILE_NAME }}
path: ${{ env.OUTPUT_FILE_NAME }}
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Post Results
run: |
mkd2html IssuesReport.md IssuesReport.html
aws s3 cp --quiet IssuesReport.html s3://testing.zephyrproject.org/issues/$GITHUB_REPOSITORY/index.html


@@ -1,37 +0,0 @@
[
{
"section": "High Priority Bugs",
"labels": ["bug", "priority: high"],
"threshold": 0
},
{
"section": "Medium Priority Bugs",
"labels": ["bug", "priority: medium"],
"threshold": 20
},
{
"section": "Low Priority Bugs",
"labels": ["bug", "priority: low"],
"threshold": 100
},
{
"section": "Enhancements",
"labels": ["Enhancement"],
"threshold": 500
},
{
"section": "Features",
"labels": ["Feature"],
"threshold": 100
},
{
"section": "Questions",
"labels": ["question"],
"threshold": 100
},
{
"section": "Static Analysis",
"labels": ["Coverity"],
"threshold": 100
}
]


@@ -1,34 +0,0 @@
name: Scancode
on: [pull_request]
jobs:
scancode_job:
runs-on: ubuntu-22.04
name: Scan code for licenses
steps:
- name: Checkout the code
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Scan the code
id: scancode
uses: zephyrproject-rtos/action_scancode@v4
with:
directory-to-scan: 'scan/'
- name: Artifact Upload
uses: actions/upload-artifact@v4
with:
name: scancode
path: ./artifacts
- name: Verify
run: |
if [ -s ./artifacts/report.txt ]; then
report=$(cat ./artifacts/report.txt)
report="${report//'%'/'%25'}"
report="${report//$'\n'/'%0A'}"
report="${report//$'\r'/'%0D'}"
echo "::error file=./artifacts/report.txt::$report"
exit 1
fi


@@ -1,39 +0,0 @@
name: Manifest
on:
pull_request_target:
jobs:
contribs:
runs-on: ubuntu-22.04
name: Manifest
steps:
- name: Checkout the code
uses: actions/checkout@v4
with:
path: zephyrproject/zephyr
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: west setup
env:
BASE_REF: ${{ github.base_ref }}
working-directory: zephyrproject/zephyr
run: |
pip install west
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
west init -l . || true
- name: Manifest
uses: zephyrproject-rtos/action-manifest@v1.7.0
with:
github-token: ${{ secrets.ZB_GITHUB_TOKEN }}
manifest-path: 'west.yml'
checkout-path: 'zephyrproject/zephyr'
use-tree-checkout: 'true'
check-impostor-commits: 'true'
label-prefix: 'manifest-'
verbosity-level: '1'
labels: 'manifest'
dnm-labels: 'DNM (manifest)'


@@ -1,54 +0,0 @@
# Copyright (c) 2023 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Misc. Pylib Scripts TestSuite
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/build_helpers/**'
- '.github/workflows/pylib_tests.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/build_helpers/**'
- '.github/workflows/pylib_tests.yml'
jobs:
pylib-tests:
name: Misc. Pylib Unit Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-22.04]
steps:
- name: checkout
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install-packages
run: |
pip install -r scripts/requirements-base.txt -r scripts/requirements-build-test.txt
- name: Run pytest for build_helpers
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
echo "Run build_helpers tests"
PYTHONPATH=./scripts/tests pytest ./scripts/tests/build_helpers


@@ -1,28 +0,0 @@
name: ready to merge
on:
workflow_call:
inputs:
needs_context:
type: string
required: true
jobs:
all_jobs_passed:
name: all jobs passed
runs-on: ubuntu-latest
steps:
- name: "Check status of all required jobs"
run: |-
NEEDS_CONTEXT='${{ inputs.needs_context }}'
JOB_IDS=$(echo "$NEEDS_CONTEXT" | jq -r 'keys[]')
for JOB_ID in $JOB_IDS; do
RESULT=$(echo "$NEEDS_CONTEXT" | jq -r ".[\"$JOB_ID\"].result")
echo "$JOB_ID job result: $RESULT"
if [[ $RESULT != "success" && $RESULT != "skipped" ]]; then
echo "***"
echo "Error: The $JOB_ID job did not pass."
exit 1
fi
done
echo "All jobs passed or were skipped."


@@ -1,60 +0,0 @@
name: Create a Release
on:
push:
tags:
- 'v*'
- '!v*rc*'
jobs:
release:
runs-on: ubuntu-22.04
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Get the version
id: get_version
run: |
echo "VERSION=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
echo "TRIMMED_VERSION=${GITHUB_REF#refs/tags/v}" >> $GITHUB_OUTPUT
- name: REUSE Compliance Check
uses: fsfe/reuse-action@v5
with:
args: spdx -o zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
- name: upload-results
uses: actions/upload-artifact@v4
continue-on-error: true
with:
name: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
path: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
- name: Create empty release notes body
run: |
echo "TODO: add release overview and notes link" > release-notes.txt
- name: Create Release
id: create_release
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
tag_name: ${{ github.ref }}
release_name: Zephyr ${{ steps.get_version.outputs.TRIMMED_VERSION }}
body_path: release-notes.txt
draft: true
prerelease: true
- name: Upload Release Assets
id: upload-release-asset
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
asset_name: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
asset_content_type: text/plain


@@ -1,61 +0,0 @@
# This workflow uses actions that are not certified by GitHub. They are provided
# by a third-party and are governed by separate terms of service, privacy
# policy, and support documentation.
name: Scorecards supply-chain security
on:
# For Branch-Protection check. Only the default branch is supported. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection
branch_protection_rule:
# To guarantee Maintained check is occasionally updated. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#maintained
schedule:
- cron: '43 7 * * 6'
push:
branches:
- main
permissions: read-all
jobs:
analysis:
name: Scorecard analysis
runs-on: ubuntu-latest
permissions:
# Needed for Code scanning upload
security-events: write
# Needed for GitHub OIDC token if publish_results is true
id-token: write
steps:
- name: "Checkout code"
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4.1.7
with:
persist-credentials: false
- name: "Run analysis"
uses: ossf/scorecard-action@62b2cac7ed8198b15735ed49ab1e5cf35480ba46 # v2.4.0
with:
results_file: results.sarif
results_format: sarif
# Publish results to OpenSSF REST API for easy access by consumers.
# - Allows the repository to include the Scorecard badge.
# - See https://github.com/ossf/scorecard-action#publishing-results.
publish_results: true
# Upload the results as artifacts (optional). Commenting out will disable
# uploads of run results in SARIF format to the repository Actions tab.
# https://docs.github.com/en/actions/advanced-guides/storing-workflow-data-as-artifacts
- name: "Upload artifact"
uses: actions/upload-artifact@89ef406dd8d7e03cfd12d9e0a4a378f454709029 # v4.3.5
with:
name: SARIF file
path: results.sarif
retention-days: 5
# Upload the results to GitHub's code scanning dashboard (optional).
# Commenting out will disable upload of results to your repo's Code Scanning dashboard
- name: "Upload to code-scanning"
uses: github/codeql-action/upload-sarif@afb54ba388a7dca6ecae48f608c4ff05ff4cc77a # v3.25.15
with:
sarif_file: results.sarif


@@ -1,74 +0,0 @@
# Copyright 2023 Google LLC
# SPDX-License-Identifier: Apache-2.0
name: Scripts tests
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/build/**'
- '.github/workflows/scripts_tests.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/build/**'
- '.github/workflows/scripts_tests.yml'
jobs:
scripts-tests:
name: Scripts tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-20.04]
steps:
- name: checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Rebase
continue-on-error: true
env:
BASE_REF: ${{ github.base_ref }}
PR_HEAD: ${{ github.event.pull_request.head.sha }}
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --graph --oneline HEAD...${PR_HEAD}
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install-packages
run: |
pip install -r scripts/requirements-base.txt -r scripts/requirements-build-test.txt
- name: Run pytest
env:
ZEPHYR_BASE: ./
run: |
echo "Run script tests"
pytest ./scripts/build


@@ -1,28 +0,0 @@
name: Stale Workflow Queue Cleanup
on:
workflow_dispatch:
branches: [main]
schedule:
# everyday at 15:00
- cron: '0 15 * * *'
concurrency:
group: stale-workflow-queue-cleanup
cancel-in-progress: true
jobs:
cleanup:
name: Cleanup
runs-on: ubuntu-22.04
steps:
- name: Delete stale queued workflow runs
uses: MajorScruffy/delete-old-workflow-runs@v0.3.0
with:
repository: ${{ github.repository }}
# Remove any workflow runs in "queued" state for more than 1 day
older-than-seconds: 86400
status: queued
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -1,28 +0,0 @@
name: "Close stale pull requests/issues"
on:
schedule:
- cron: "16 00 * * *"
jobs:
stale:
name: Find Stale issues and PRs
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- uses: actions/stale@v9
with:
stale-pr-message: 'This pull request has been marked as stale because it has been open for more
than 60 days with no activity. Remove the stale label, or add a comment saying that you
would like to have the label removed; otherwise this pull request will automatically be
closed in 14 days. Note that you can always re-open a closed pull request at any time.'
stale-issue-message: 'This issue has been marked as stale because it has been open for more
than 60 days with no activity. Remove the stale label, or add a comment saying that you
would like to have the label removed; otherwise this issue will automatically be closed in
14 days. Note that you can always re-open a closed issue at any time.'
days-before-stale: 60
days-before-close: 14
stale-issue-label: 'Stale'
stale-pr-label: 'Stale'
exempt-pr-labels: 'Blocked,In progress'
exempt-issue-labels: 'In progress,Enhancement,Feature,Feature Request,RFC,Meta,Process,Coverity'
operations-per-run: 400


@@ -1,24 +0,0 @@
name: Merged PR stats
on:
pull_request_target:
branches:
- main
- v*-branch
types: [closed]
jobs:
record_merged:
if: github.event.pull_request.merged == true && github.repository == 'zephyrproject-rtos/zephyr'
runs-on: ubuntu-22.04
steps:
- name: checkout
uses: actions/checkout@v4
- name: PR event
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
PR_STAT_ES_INDEX: ${{ vars.PR_STAT_ES_INDEX }}
run: |
pip install pygithub elasticsearch
python3 ./scripts/ci/stats/merged_prs.py --pull-request ${{ github.event.pull_request.number }} --repo ${{ github.repository }}


@@ -1,147 +0,0 @@
name: Prepare For a Twister Run
on:
workflow_call:
outputs:
subset:
description: subset
value: ${{ jobs.prep_push.outputs.subset != '' && jobs.prep_push.outputs.subset || jobs.prep_pr.outputs.subset }}
size:
description: size
value: ${{ jobs.prep_push.outputs.size != '' && jobs.prep_push.outputs.size || jobs.prep_pr.outputs.size }}
fullrun:
description: fullrun
value: ${{ jobs.prep_push.outputs.fullrun != '' && jobs.prep_push.outputs.fullrun || jobs.prep_pr.outputs.fullrun }}
jobs:
prep_pr:
if: github.repository_owner == 'zephyrproject-rtos' && github.event_name == 'pull_request_target'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.27.4.20241026
options: '--entrypoint /bin/bash'
outputs:
subset: ${{ steps.output-services.outputs.subset }}
size: ${{ steps.output-services.outputs.size }}
fullrun: ${{ steps.output-services.outputs.fullrun }}
env:
MATRIX_SIZE: 10
PUSH_MATRIX_SIZE: 20
WEEKLY_MATRIX_SIZE: 200
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
TESTS_PER_BUILDER: 900
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Environment Setup
run: |
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
west init -l . || true
west config manifest.group-filter -- +ci,+optional
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Generate Test Plan with Twister
id: test-plan
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --no-detailed-test-id --pull-request -t $TESTS_PER_BUILDER
if [ -s .testplan ]; then
cat .testplan >> $GITHUB_ENV
else
echo "TWISTER_NODES=${MATRIX_SIZE}" >> $GITHUB_ENV
fi
rm -f testplan.json .testplan
- name: Determine matrix size
id: output-services
run: |
if [ -n "${TWISTER_NODES}" ]; then
subset="[$(seq -s',' 1 ${TWISTER_NODES})]"
else
subset="[$(seq -s',' 1 ${MATRIX_SIZE})]"
fi
size=${TWISTER_NODES}
echo "subset=${subset}" >> $GITHUB_OUTPUT
echo "size=${size}" >> $GITHUB_OUTPUT
echo "fullrun=${TWISTER_FULL}" >> $GITHUB_OUTPUT
prep_push:
if: github.repository_owner == 'zephyrproject-rtos' && (github.event_name == 'push' || github.event_name == 'schedule')
runs-on: ubuntu-22.04
outputs:
subset: ${{ steps.output-services.outputs.subset }}
size: ${{ steps.output-services.outputs.size }}
fullrun: ${{ steps.output-services.outputs.fullrun }}
env:
MATRIX_SIZE: 10
PUSH_MATRIX_SIZE: 20
WEEKLY_MATRIX_SIZE: 200
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
TESTS_PER_BUILDER: 900
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
steps:
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Determine matrix size
id: output-services
run: |
if [ "${{github.event_name}}" = "push" ]; then
subset="[$(seq -s',' 1 ${PUSH_MATRIX_SIZE})]"
size=${MATRIX_SIZE}
elif [ "${{github.event_name}}" = "schedule" -a "${{github.repository}}" = "zephyrproject-rtos/zephyr" ]; then
subset="[$(seq -s',' 1 ${WEEKLY_MATRIX_SIZE})]"
size=${WEEKLY_MATRIX_SIZE}
else
size=0
fi
echo "subset=${subset}" >> $GITHUB_OUTPUT
echo "size=${size}" >> $GITHUB_OUTPUT
echo "fullrun=${TWISTER_FULL}" >> $GITHUB_OUTPUT


@@ -1,53 +0,0 @@
name: Publish Twister Test Results
on:
workflow_run:
workflows: ["Run tests with twister"]
branches:
- main
types:
- completed
jobs:
upload-to-elasticsearch:
if: |
github.repository == 'zephyrproject-rtos/zephyr' &&
github.event.workflow_run.event != 'pull_request_target'
env:
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
runs-on: ubuntu-22.04
steps:
# Needed for elasticsearch and upload script
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
persist-credentials: false
- name: Download Artifacts
id: download-artifacts
uses: dawidd6/action-download-artifact@v8
with:
path: artifacts
workflow: twister.yml
run_id: ${{ github.event.workflow_run.id }}
if_no_artifact_found: ignore
- name: Upload to elasticsearch
if: steps.download-artifacts.outputs.found_artifact == 'true'
run: |
pip install elasticsearch
# set run date on upload to get consistent and unified data across the matrix.
run_date=`date --iso-8601=minutes`
if [ "${{github.event.workflow_run.event}}" = "push" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--run-attempt ${{github.run_attempt}} \
--run-branch ${{github.ref_name}} \
--index zephyr-main-ci-push-1 artifacts/*/*/twister.json
elif [ "${{github.event.workflow_run.event}}" = "schedule" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--run-attempt ${{github.run_attempt}} \
--run-branch ${{github.ref_name}} \
--index zephyr-main-ci-weekly-1 artifacts/*/*/twister.json
fi

View File

@@ -1,282 +0,0 @@
name: Run tests with twister
on:
push:
branches:
- main
- v*-branch
- collab-*
pull_request_target:
branches:
- main
- v*-branch
- collab-*
schedule:
# Run at 17:00 UTC on every Saturday
- cron: '0 17 * * 6'
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
twister-build-prep:
uses: ./.github/workflows/twister-prep.yaml
twister-build:
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
needs: twister-build-prep
if: needs.twister-build-prep.outputs.size != 0
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.27.4.20241026
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
subset: ${{fromJSON(needs.twister-build-prep.outputs.subset)}}
timeout-minutes: 1440
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
# `--specs` is ignored because ccache is unable to resolve the toolchain specs file path.
CCACHE_IGNOREOPTIONS: '-specs=* --specs=*'
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
TWISTER_COMMON: '--no-detailed-test-id --force-color --inline-logs -v -N -M --retry-failed 3 --timeout-multiplier 2 '
WEEKLY_OPTIONS: ' -M --build-only --all --show-footprint --report-filtered'
PR_OPTIONS: ' --clobber-output --integration'
PUSH_OPTIONS: ' --clobber-output -M --show-footprint --report-filtered'
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
LLVM_TOOLCHAIN_PATH: /usr/lib/llvm-16
steps:
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Environment Setup
run: |
if [ "${{github.event_name}}" = "pull_request_target" ]; then
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Builder"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
fi
echo "$HOME/.local/bin" >> $GITHUB_PATH
echo "$HOME/.cargo/bin" >> $GITHUB_PATH
west init -l . || true
west config manifest.group-filter -- +ci,+optional
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Check Environment
run: |
cmake --version
gcc --version
cargo --version
rustup target list --installed
ls -la
echo "github.ref: ${{ github.ref }}"
echo "github.base_ref: ${{ github.base_ref }}"
echo "github.ref_name: ${{ github.ref_name }}"
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- name: Update BabbleSim to manifest revision
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git -c advice.detachedHead=false checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- if: github.event_name == 'push'
name: Run Tests with Twister (Push)
id: run_twister
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} ${TWISTER_COMMON} ${PUSH_OPTIONS}
if [ "${{matrix.subset}}" = "1" ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${PUSH_OPTIONS}
fi
fi
- if: github.event_name == 'pull_request_target'
name: Run Tests with Twister (Pull Request)
id: run_twister_pr
run: |
rm -f testplan.json
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --pull-request --no-detailed-test-id
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} --load-tests testplan.json ${TWISTER_COMMON} ${PR_OPTIONS}
if [ "${{matrix.subset}}" = "1" -a ${{needs.twister-build-prep.outputs.fullrun}} = 'True' ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${PR_OPTIONS}
fi
fi
- if: github.event_name == 'schedule'
name: Run Tests with Twister (Weekly)
id: run_twister_sched
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} ${TWISTER_COMMON} ${WEEKLY_OPTIONS}
if [ "${{matrix.subset}}" = "1" ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${WEEKLY_OPTIONS}
fi
fi
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@v4
with:
name: Unit Test Results (Subset ${{ matrix.subset }})
if-no-files-found: ignore
path: |
twister-out/twister.xml
twister-out/twister.json
module_tests/twister.xml
testplan.json
- if: matrix.subset == 1 && github.event_name == 'push'
name: Save the list of Python packages
shell: bash
run: |
FREEZE_FILE="frozen-requirements.txt"
timestamp="$(date)"
version="$(git describe --abbrev=12 --always)"
echo -e "# Generated at $timestamp ($version)\n" > $FREEZE_FILE
pip freeze | tee -a $FREEZE_FILE
- if: matrix.subset == 1 && github.event_name == 'push'
name: Upload the list of Python packages
uses: actions/upload-artifact@v4
with:
name: Frozen PIP package set
path: |
frozen-requirements.txt
twister-test-results:
name: "Publish Unit Tests Results"
needs:
- twister-build
runs-on: ubuntu-22.04
# the build-and-test job might be skipped, we don't need to run this job then
if: success() || failure()
steps:
- name: Check out source code
if: needs.twister-build.result == 'failure'
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
path: artifacts
- name: Merge Test Results
run: |
pip install junitparser junit2html
junitparser merge artifacts/*/*/twister.xml junit.xml
junit2html junit.xml junit.html
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@v4
with:
name: Unit Test Results
if-no-files-found: ignore
path: |
junit.html
junit.xml
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@v2
with:
check_name: Unit Test Results
files: "**/twister.xml"
comment_mode: off
- name: Analyze Twister Reports
if: needs.twister-build.result == 'failure'
run: |
./scripts/ci/twister_report_analyzer.py artifacts/*/*/twister.json --long-summary --platforms --output-md errors.md
if [[ -s "errors.md" ]]; then
echo '### Error Summary! 🚀' >> $GITHUB_STEP_SUMMARY
cat errors.md >> $GITHUB_STEP_SUMMARY
fi
- name: Upload Twister Analysis Results
if: needs.twister-build.result == 'failure'
uses: actions/upload-artifact@v4
with:
name: Twister Analysis Results
if-no-files-found: ignore
path: |
twister_report_summary.json
twister-status-check:
if: always()
name: "Check Twister Status"
needs:
- twister-build-prep
- twister-build
uses: ./.github/workflows/ready-to-merge.yml
with:
needs_context: ${{ toJson(needs) }}


@@ -1,69 +0,0 @@
# Copyright (c) 2020 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Twister TestSuite
on:
push:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/pylib/**'
- 'scripts/twister'
- 'scripts/tests/twister/**'
- '.github/workflows/twister_tests.yml'
- 'scripts/schemas/twister/'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/**'
- 'scripts/twister'
- 'scripts/tests/twister/**'
- '.github/workflows/twister_tests.yml'
- 'scripts/schemas/twister/'
jobs:
twister-tests:
name: Twister Unit Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-22.04]
steps:
- name: checkout
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install-packages
run: |
pip install -r scripts/requirements-base.txt -r scripts/requirements-build-test.txt -r scripts/requirements-run-test.txt
- name: Run pytest for twisterlib
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
echo "Run twister tests"
PYTHONPATH=./scripts/tests pytest ./scripts/tests/twister
- name: Run pytest for pytest-twister-harness
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
PYTHONPATH: ./scripts/pylib/pytest-twister-harness/src:${PYTHONPATH}
run: |
echo "Run twister tests"
pytest ./scripts/pylib/pytest-twister-harness/tests


@@ -1,80 +0,0 @@
# Copyright (c) 2023 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Twister BlackBox TestSuite
on:
pull_request:
branches:
- main
- collab-*
- v*-branch
paths:
- 'scripts/pylib/twister/**'
- 'scripts/twister'
- 'scripts/tests/twister_blackbox/**'
- '.github/workflows/twister_tests_blackbox.yml'
jobs:
twister-tests:
name: Twister Black Box Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-22.04]
container:
image: ghcr.io/zephyrproject-rtos/ci:v0.27.4
steps:
- name: Apply Container Owner Mismatch Workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Checkout
uses: actions/checkout@v4
- name: Environment Setup
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
west init -l . || true
# we do not depend on any hals, tools or bootloader, save some time and space...
west config manifest.group-filter -- -hal,-tools,-bootloader,-babblesim
west config manifest.project-filter -- -nrf_hw_models
west config --global update.narrow true
west update --path-cache /github/cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /github/cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /github/cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Set Up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Go Into Venv
shell: bash
run: |
python3 -m pip install --user virtualenv
python3 -m venv env
source env/bin/activate
echo "$(which python)"
- name: Install Packages
run: |
python3 -m pip install -U -r scripts/requirements-base.txt -r scripts/requirements-build-test.txt -r scripts/requirements-run-test.txt
- name: Run Pytest For Twister Black Box Tests
shell: bash
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
echo "Run twister tests"
source zephyr-env.sh
PYTHONPATH="./scripts/tests" pytest ./scripts/tests/twister_blackbox


@@ -1,76 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# SPDX-License-Identifier: Apache-2.0
name: Zephyr West Command Tests
on:
push:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/west-commands.yml'
- 'scripts/west_commands/**'
- '.github/workflows/west_cmds.yml'
pull_request:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/west-commands.yml'
- 'scripts/west_commands/**'
- '.github/workflows/west_cmds.yml'
jobs:
west-commnads:
name: West Command Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-22.04, macos-14, windows-2022]
steps:
- name: checkout
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: cache-pip-mac
if: startsWith(runner.os, 'macOS')
uses: actions/cache@v4
with:
path: ~/Library/Caches/pip
# Trailing '-' was just to get a different cache name
key: ${{ runner.os }}-pip-${{ matrix.python-version }}-
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}-
- name: cache-pip-win
if: startsWith(runner.os, 'Windows')
uses: actions/cache@v4
with:
path: ~\AppData\Local\pip\Cache
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install pytest
run: |
pip install pytest west pyelftools canopen natsort progress mypy intelhex psutil ply pyserial anytree junitparser
- name: run pytest-win
if: runner.os == 'Windows'
run: |
python ./scripts/west_commands/run_tests.py
- name: run pytest-mac-linux
if: runner.os != 'Windows'
run: |
./scripts/west_commands/run_tests.py

.gitignore

@@ -7,30 +7,17 @@
*.swp
*.swo
*~
# Emacs
\#*\#
build*/
!doc/build/
!scripts/build
!tests/drivers/build_all
!scripts/pylib/build_helpers
cscope.*
.dir
/*.patch
# The .cache directory will be used to cache toolchain capabilities if
# no suitable out-of-tree directory is found.
.cache
outdir
outdir-*
scripts/basic/fixdep
scripts/gen_idt/gen_idt
coverage-report
doc-coverage.info
scripts/kconfig/conf
scripts/kconfig/mconf
scripts/kconfig/zconf.hash.c
scripts/kconfig/zconf.lex.c
scripts/kconfig/zconf.tab.c
doc/_build
doc/doxygen
doc/xml
@@ -39,73 +26,12 @@ doc/boards
doc/samples
doc/latex
doc/themes/zephyr-docs-theme
sanity-out*
twister-out*
bsim_out
bsim_bt_out
myresults.xml
tests/RunResults.xml
sanity-out/
scripts/grub
doc/reference/kconfig/*.rst
doc/doc.warnings
.*project
.settings
.envrc
.vscode
hide-defaults-note
venv
.venv
.DS_Store
.clangd
new.info
# Cargo drops lock files in projects to capture resolved dependencies.
# We don't want to record these.
Cargo.lock
# Cargo encourages a .cargo/config.toml file to symlink to a generated file. Don't save these.
.cargo/
# Normal west builds will place the Rust target directory under the build directory. However,
# sometimes IDEs and such will litter these target directories as well.
target/
# CI output
compliance.xml
_error.types
# Tag files
GPATH
GRTAGS
GTAGS
TAGS
doc/reference/kconfig/CONFIG_*
doc/reference/kconfig/index.rst
tags
.idea
# from check_compliance.py
BinaryFiles.txt
BoardYml.txt
Checkpatch.txt
ClangFormat.txt
DevicetreeBindings.txt
GitDiffCheck.txt
Gitlint.txt
Identity.txt
ImageSize.txt
Kconfig.txt
KconfigBasic.txt
KconfigBasicNoModules.txt
KconfigHWMv2.txt
KeepSorted.txt
MaintainersFormat.txt
ModulesMaintainers.txt
Nits.txt
Pylint.txt
Ruff.txt
SphinxLint.txt
SysbuildKconfig.txt
SysbuildKconfigBasic.txt
SysbuildKconfigBasicNoModules.txt
TextEncoding.txt
YAMLLint.txt
.project
.cproject
.xxproject
.envrc


@@ -1,14 +1,10 @@
# All these sections are optional, edit this file as you like.
# Zephyr-specific defaults are located in scripts/gitlint/zephyr_commit_rules.py
[general]
ignore=title-trailing-punctuation, T3, title-max-length, T1, body-hard-tab, B3, B1
ignore=title-trailing-punctuation, T3, title-max-length, T1
# verbosity should be a value between 1 and 3, the commandline -v flags take precedence over this
verbosity = 3
# By default gitlint will ignore merge commits. Set to 'false' to disable.
ignore-merge-commits=false
ignore-revert-commits=false
ignore-fixup-commits=false
ignore-squash-commits=false
ignore-merge-commits=true
# Enable debug mode (prints more output). Disabled by default
debug = false
@@ -17,22 +13,19 @@ debug = false
extra-path=scripts/gitlint
[title-max-length-no-revert]
# line-length=75
[body-min-line-count]
# min-line-count=1
line-length=72
[body-max-line-count]
# max-line-count=200
max-line-count=200
[title-starts-with-subsystem]
regex = ^(?!subsys:)(?!treewide:)(([^:]+):)(\s([^:]+):)*\s(.+)$
regex = ^(([^:]+):)(\s([^:]+):)*\s(.+)$
[title-must-not-contain-word]
# Comma-separated list of words that should not occur in the title. Matching is case
# insensitive. It's fine if the keyword occurs as part of a larger word (so "WIPING"
# will not cause a violation, but "WIP: my title" will).
words=wip
words=wip,title
[title-match-regex]
# python like regex (https://docs.python.org/2/library/re.html) that the
@@ -43,7 +36,7 @@ words=wip
[max-line-length-with-exceptions]
# B1 = body-max-line-length
# line-length=75
line-length=72
[body-min-length]
min-length=3
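For a quick sanity check of the title-starts-with-subsystem patterns shown in the hunk above, both regex variants can be exercised with Python's re module. The snippet below is illustrative only and is not part of the repository; the sample commit titles are made up for the demonstration.

import re

# Both regex variants that appear in the .gitlint hunk above.
with_lookahead = re.compile(r'^(?!subsys:)(?!treewide:)(([^:]+):)(\s([^:]+):)*\s(.+)$')
plain = re.compile(r'^(([^:]+):)(\s([^:]+):)*\s(.+)$')

# Illustrative commit titles only.
titles = [
    "drivers: serial: fix a typo in the help text",
    "subsys: tweak something",
    "no subsystem prefix here",
]
for title in titles:
    print(repr(title))
    print("  plain variant matches:     ", bool(plain.match(title)))
    print("  lookahead variant matches: ", bool(with_lookahead.match(title)))

Both variants require at least one "subsystem:" prefix before the summary; the lookahead variant additionally rejects titles that start with "subsys:" or "treewide:".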

.known-issues/README

@@ -0,0 +1,55 @@
This directory contains configuration files to ignore errors found in
the build and test process which are known to the developers and for
now can be safely ignored.
To use:
$ cd zephyr
$ make SOMETHING >& result
$ scripts/filter-known-issues.py result
It is included in the source tree so if anyone has to submit anything
that triggers some kind of error that is a false positive, it can
include the "ignore me" file, properly documented.
Each file can contain one or more multiline Python regular expressions
(https://docs.python.org/2/library/re.html#regular-expression-syntax)
that match an error message. Multiple regular expressions are
separated by comment blocks (that start with #). Note that an empty
line still is considered part of the multiline regular expression.
For example
---beginning---
#
# This testcase always fails, pending fix ZEP-1234
#
.*/tests/kernel/grumpy .* FAIL
#
# Documentation issue, masks:
#
# /home/e/inaky/z/kernel.git/doc/api/io_interfaces.rst:28: WARNING: Invalid definition: Expected identifier in nested name. [error at 19]
# struct dev_config::@65 dev_config::bits
# -------------------^
#
^(?P<filename>.+/doc/api/io_interfaces.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^\s+struct dev_config::@[0-9]+ dev_config::bits.*
^\s+-+\^
---end---
Note you want to:
- use relative paths; instead of
/home/me/mydir/zephyr/something/somewhere.c you will want
^.*/something/somewhere.c (as they will depend on where it is being
built)
- Replace line numbers with [0-9]+, as they will change
- (?P<filename>[-._/\w]+/something/somewhere.c) saves the match on
that file path in a "variable" called 'filename' that later you can
match with (?P=filename) if you want to match multiple lines of the
same error message.
Can get really twisted and interesting in terms of regexps; they are
powerful, so start small :)
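To make the mechanism above concrete, here is a minimal Python sketch of the filtering idea. It is not the actual scripts/filter-known-issues.py implementation; it only shows one plausible way to split an ignore file into comment-separated blocks, compile each block as a multiline regular expression, and strip the matches from a build log. The path in the __main__ section is a placeholder.

import re
import sys

def load_patterns(conf_path):
    # Split the ignore file into blocks separated by comment lines ('#...').
    # Blank lines stay inside the current block, as described in the README.
    blocks, current = [], []
    with open(conf_path) as conf:
        for line in conf:
            if line.startswith("#"):
                if any(part.strip() for part in current):
                    blocks.append("".join(current))
                current = []
            else:
                current.append(line)
    if any(part.strip() for part in current):
        blocks.append("".join(current))
    # re.MULTILINE makes ^ and $ anchor on every line of the log.
    return [re.compile(block, re.MULTILINE) for block in blocks]

def filter_log(text, patterns):
    # Drop every region of the log matched by a known-issue pattern.
    for pattern in patterns:
        text = pattern.sub("", text)
    return text

if __name__ == "__main__":
    # Placeholder path; the real ignore files live under .known-issues/.
    patterns = load_patterns(".known-issues/make.conf")
    sys.stdout.write(filter_log(sys.stdin.read(), patterns))

This simplified sketch reads the log from stdin; the real script takes the result file on the command line and supports options such as --config-dir, as used in .shippable.yml below.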


@@ -0,0 +1,42 @@
#
# Bluetooth unnamed struct definition
#
# FIXME: all these should match the relative filename
#
^(?P<filename>[-._/\w]+/doc/api/bluetooth.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]$
^[ \t]*$
^[ \t]*\^$
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]$
^[ \t]*$
^[ \t]*\^$
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]$
^.*bt_conn_info.__unnamed__.*$
^[- \t]*\^$
#
# bt_gatt_discover_params unnamed struct definition
#
^(?P<filename>[-._/\w]+/doc/api/bluetooth.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]
^.*bt_gatt_discover_params.__unnamed__.*
^[- \t]*\^
#
# Bluetooth GATT unnamed struct definition
#
^(?P<filename>[-._/\w]+/doc/api/bluetooth.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]
^.*bt_gatt_read_params.__unnamed__.*
^[- \t]*\^
#
# Bluetooth packed
#
^(?P<filename>[-._/\w]+/doc/api/bluetooth.rst):(?P<lineno>[0-9]+): WARNING: cpp:typeOrConcept targets a member \(__packed\).$


@@ -0,0 +1,6 @@
#
^(?P<filename>([-:\\/\w\.])+[/\\]doc[/\\]api[/\\]file_system.rst):(?P<lineno>[0-9]+): WARNING: Duplicate declaration.
#
^(?P<filename>([-:\\/\w\.])+[/\\]doc[/\\]api[/\\]io_interfaces.rst):(?P<lineno>[0-9]+): WARNING: Duplicate declaration.
#
^(?P<filename>([-:\\/\w\.])+[/\\]doc[/\\]subsystems[/\\]sensor.rst):(?P<lineno>[0-9]+): WARNING: Duplicate declaration.


@@ -0,0 +1,18 @@
#
# KERNELVERSION not being defined in local builds, kill that warning,
# can ignore it
#
^.*/Kconfig.zephyr:[0-9]+: warning: The symbol KERNELVERSION references the non-existent environment variable KERNELVERSION.*
#
# Documentation generation, early message
#
^cd .* && doxygen doc/doxygen.config
^srctree=.* SRCARCH=\w+ python scripts/genrest/genrest.py .*$
# This cuts the sphinx build line; has to be separate because in the
# middle, we have removed the KERNELVERSION one and a full regex won't match
^sphinx-build -t \w+ -b html .*
#
# Documentation generation, footer message
#
^[ \t]*
^Build finished. The HTML pages are in [-._/\w]+


@@ -0,0 +1,14 @@
#
# Kernel unnamed struct definition
#
# FIXME: all these should match the relative filename
#
^(?P<filename>[-._/\w]+/doc/api/kernel_api.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]$
^[ \t]*$
^[ \t]*\^$
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]$
^[ \t]*$
^[ \t]*\^$
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]$
^.*k_msg.extra.*$
^[- \t]*\^$


@@ -0,0 +1,113 @@
#
# Networking
#
#
# include/net/net_ip.h warnings
#
^(?P<filename>[-._/\w]+/doc/api/networking.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]
^.*in[_6]+addr.in[46]_u
^[- \t]*\^
#
# include/net/net_if.h warnings
#
^(?P<filename>[-._/\w]+/doc/api/networking.rst):(?P<lineno>[0-9]+): WARNING: Error when parsing function declaration.
^If the function has no return type:
^[ \t]*Error in declarator or parameters and qualifiers
^[ \t]*Invalid definition: Expected identifier in nested name, got keyword: struct \[error at [0-9]+]
^.*struct net_if __aligned\(32\)
^[- \t]*\^
^If the function has a return type:
^[ \t]*Error in declarator or parameters and qualifiers
^[ \t]*If pointer to member declarator:
^[ \t]*Invalid definition: Expected \'::\' in pointer to member \(function\). \[error at [0-9]+]
^.*struct net_if __aligned\(32\)
^[- \t]*\^
^[ \t]*If declarator-id:
^[ \t]*Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^.*struct net_if __aligned\(32\)
^[- \t]*\^
#
# include/net/net_mgmt.h
#
^(?P<filename>[-._/\w]+/doc/api/networking.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]
^.*net_mgmt_event_callback.__unnamed__
^[- \t]*\^
#
# include/net/buf.h
#
^(?P<filename>[-._/\w]+/doc/api/networking.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]
^.*net_buf.__unnamed__
^[- \t]*\^
#
# include/net/ieee802154.h
#
^(?P<filename>[-._/\w]+/doc/api/networking.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]
^.*ieee802154_req_params.__unnamed__
^[- \t]*\^
#
# Various warning about net_mgmt declarations
#
^(?P<filename>[-._/\w]+/doc/api/networking.rst):(?P<lineno>[0-9]+): WARNING: Error when parsing function declaration\.
^If the function has no return type\:
^[ \t]*Error in declarator or parameters and qualifiers
^[ \t]*Invalid definition: Expected identifier in nested name\. \[error at [0-9]+\]
^[ \t]*NET_MGMT_DEFINE_.*?$
^[- \t]*\^
#
^If the function has a return type\:
^[ \t]*Error in declarator
^[ \t]*If pointer to member declarator:
^[ \t]*Invalid definition: Expected identifier in nested name\. \[error at [0-9]+\]
^[ \t]*NET_MGMT_DEFINE_.*?$
^[- \t]*\^
#
^[ \t]*If declId, parameters, and qualifiers\:
^[ \t]*Invalid definition: Expected identifier in nested name\. \[error at [0-9]+\]
^[ \t]*NET_MGMT_DEFINE_.*?$
^[- \t]*\^
#
^[ \t]*If parenthesis in noptr-declarator\:
^[ \t]*Error in declarator
^[ \t]*If pointer to member declarator:
^[ \t]*Invalid definition: Expected identifier in nested name\. \[error at [0-9]+\]
^[ \t]*NET_MGMT_DEFINE_.*?$
^[- \t]*\^
#
^[ \t]*If parenthesis in noptr-declarator\:
^[ \t]*Error in declarator or parameters and qualifiers
^[ \t]*If pointer to member declarator:
^[ \t]*Invalid definition: Expected \'\:\:\' in pointer to member \(function\)\. \[error at [0-9]+\]
^[ \t]*NET_MGMT_DEFINE_.*?$
^[- \t]*\^
#
^[ \t]*If declarator-id:
^[ \t]*Invalid definition: Expecting \"\(\" in parameters_and_qualifiers\. \[error at [0-9]+\]
^[ \t]*NET_MGMT_DEFINE_.*?$
^[- \t]*\^


@@ -0,0 +1,12 @@
#
# Sensor value unnamed struct definition
#
^(?P<filename>[-._/\w]+/doc/api/io_interfaces.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]
^.*sensor_value.__unnamed__.*
^[- \t]*\^


@@ -0,0 +1,15 @@
#
# UART unnamed struct definition
#
^(?P<filename>[-._/\w]+/doc/api/io_interfaces.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]
^.*uart_device_config.__unnamed__.*
^[- \t]*\^

.known-issues/make.conf

@@ -0,0 +1,6 @@
#
# When filtering output of the build process, ignore lines that don't
# provide any information that helps the invoker tell if there was an
# error.
#
^make: (Entering|Leaving) directory .*


@@ -0,0 +1,11 @@
#
# When executing test cases, ignore the following messages as they are
# not to be considered hard errors.
#
# Block line when test case cannot run in the HW due to server or connection issues
#
^BLCK0/[-a-z0-9:]+ (.+)#test @[^/]+/[^:]+:[^:]+: evaluation blocked(.*)$
#
# Block line when there is an issue with the YKUSH serial connection
#
^BLCK0/[-a-z0-9:]+ (.+)#test @[^/]+/(?P<board>[^:]+):[^:]+: exception: 400: (?P=board): Cannot find YKUSH serial '[A-Z0-9]+'$


@@ -0,0 +1,18 @@
#
# When executing test cases under TCF, ignore the following messages
# as they are not to be considered hard errors.
#
# TCF is run under make to take advantage of the jobserver; when
# the testcase execution fails, make will complain, which we can
# ignore ('sommersault' was the old name of the target).
#
^/tmp/tcf-[a-zA-Z0-9]+.mk:[0-9]+: recipe for target ('tcf-jobserver-run'|'sommersault') failed$
#
# More of the same
#
^make: \*\*\* \[(tcf-jobserver-run|sommersault)\] Error 1$
#
# TCF's summary line. We don't need to consider it to determine if the
# run failed or passed.
#
^[A-Z]+0/\S+:\s+\S+\s+@\S+: [0-9]+ tests \([0-9]+ passed, [0-9]+ failed, [0-9]+ blocked, [0-9]+ skipped\).*$


@@ -0,0 +1,4 @@
#
# Skip line when test case is eliminated due to filters
#
^SKIP0/\S+\s+\S+: No targets can be used \(all [0-9]+ selected from [0-9]+ available eliminated by testcase filtering\)$

.mailmap

@@ -1,124 +1,21 @@
Alexandr Kolosov <rikorsev@gmail.com>
Alexandre d'Alton <alexandre.dalton@intel.com>
Amir Kaplan <amir.kaplan@intel.com>
Anas Nashif <anas.nashif@intel.com>
Andrzej Kuroś <andrzej.kuros@nordicsemi.no>
Anthony Smigielski <thebasti0ncode@gmail.com>
Armand Ciejak <armand@riedonetworks.com> <armandciejak@users.noreply.github.com>
Aska Wu <aska.wu@linaro.org>
Bit Pathe <bitpathe@gmail.com>
Bjarki Arge Andreasen <baa@trackunit.com>
Carles Cufi <carles.cufi@nordicsemi.no>
chao an <anchao@xiaomi.com>
Charles E. Youse <charles.youse@intel.com>
Chen Xingyu <hi@xingrz.me>
Christoph Schnetzler <christoph.schnetzler@husqvarnagroup.com>
Christoph Schramm <schramm@makaio.com>
Christopher Friedt <cfriedt@meta.com>
Christopher Friedt <cfriedt@meta.com> <cfriedt@fb.com>
Chuck Jordan <cjordan@synopsys.com>
Chunlin Han <chunlin.han@linaro.org> <chunlin.han@acer.com>
David B. Kinder <david.b.kinder@intel.com>
David Komel <a8961713@gmail.com>
David Leach <david.leach@nxp.com>
Dirk Brandewie <dirk.j.brandewie@intel.com>
Douglas Su <d0u9.su@outlook.com>
Enjia Mai <enjia.mai@intel.com>
Enjia Mai <enjia.mai@intel.com>
Evan Couzens <evanx.couzens@intel.com>
Evgeniy Paltsev <PaltsevEvgeniy@gmail.com>
Evgeniy Paltsev <PaltsevEvgeniy@gmail.com> <Eugeniy.Paltsev@synopsys.com>
Felipe Neves <ryukokki.felipe@gmail.com>
Findlay Feng <i@fengch.me>
Flavio Arieta Netto <flavio@exati.com.br>
Francois Ramu <francois.ramu@st.com>
Gerardo Aceves <gerardo.aceves@intel.com>
Gregory Shue <gregory.shue@legrand.com>
Gregory Shue <gregory.shue@legrand.com> <gregory.shue@legrand.us>
HaiLong Yang <cameledyang@pm.me>
James Johnson <james.johnson672@t-mobile.com>
Jarno Lämsä <jarno.lamsa@nordicsemi.no>
Jędrzej Ciupis <jedrzej.ciupis@nordicsemi.no>
Jeremie Garcia <jeremie.garcia@intel.com>
Jim Benjamin Luther <jilu@oticon.com>
Johan Kruger <johan.kruger@windriver.com>
Johann Fischer <j.fischer@phytec.de>
Jørgen Kvalvaag <jorgen.kvalvaag@nordicsemi.no>
Juan Manuel Cruz Alcaraz <juan.m.cruz.alcaraz@intel.com>
Juan Solano <juanx.solano.menacho@intel.com>
Julien D'Ascenzio <julien.dascenzio@paratronic.fr>
Jun Li <jun.r.li@intel.com>
Justin Watson <jwatson5@gmail.com>
Kamil Sroka <kamil.sroka@nordicsemi.no>
Katarzyna Giądła <katarzyna.giadla@nordicsemi.no>
Keren Siman-Tov <keren.siman-tov@intel.com>
Krzysztof Chruściński <krzysztof.chruscinski@nordicsemi.no>
Kuo-Lang Tseng <kuo-lang.tseng@intel.com>
Lei Liu <lei.a.liu@intel.com>
Leona Cook <leonax.cook@intel.com>
Dirk Brandewie <dirk.j.brandewie@intel.com> <dirk.j.brandewie@intel.com>
Mike Hirst <michael.hirst@windriver.com> <michael.hirst@windriver.com>
Johan Kruger <johan.kruger@windriver.com> <johan.kruger@windriver.com>
Leona Cook <leonax.cook@intel.com> <lsc@hackeress.com>
Lixin Guo <lixinx.guo@intel.com>
Łukasz Mazur <lukasz.mazur@hidglobal.com>
Manuel Argüelles <manuel.arguelles@nxp.com>
Manuel Argüelles <manuel.arguelles@nxp.com> <manuel.arguelles@coredumplabs.com>
Manuel Argüelles <manuel.arguelles@nxp.com> <marguelles.dev@gmail.com>
Marc Herbert <marc.herbert@intel.com> <46978960+marc-hb@users.noreply.github.com>
Marin Jurjević <marin.jurjevic@hotmail.com>
Mariusz Ryndzionek <mariusz.ryndzionek@firmwave.com>
Mariusz Skamra <mariusz.skamra@codecoup.pl>
Mariusz Skamra <mariusz.skamra@codecoup.pl> <mariusz.skamra@tieto.com>
Martí Bolívar <marti.bolivar@nordicsemi.no>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti.bolivar@linaro.org>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti.f.bolivar@gmail.com>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti@foundries.io>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti@opensourcefoundries.com>
Martin Jäger <martin@libre.solar> <17674105+martinjaeger@users.noreply.github.com>
Mateusz Hołenko <mholenko@antmicro.com>
Michael Rosen <michael.r.rosen@intel.com>
Michal Narajowski <michal.narajowski@codecoup.pl>
Mike Hirst <michael.hirst@windriver.com>
Ming Shao <ming.shao@intel.com>
Mohan Kumar Kumar <mohankm@fb.com>
Naga Raja Rao Tulasi <tulasi.r@tcs.com>
Navin Sankar Velliangiri <navin@linumiz.com>
NingX Zhao <ningx.zhao@intel.com>
Nishikant Nayak <nishikantax.nayak@intel.com>
Ole Sæther <ole.saether@nordicsemi.no>
Pavel Král <pavel.kral@omsquare.com>
Pavel Vasilyev <pavel.vasilyev@nordicsemi.no>
Paweł Czarnecki <pczarnecki@antmicro.com>
Paweł Czarnecki <pczarnecki@antmicro.com>
Paweł Czarnecki <pczarnecki@antmicro.com> <pczarnecki@internships.antmicro.com>
Paweł Kwiek <pawel.kwiek@nordicsemi.no>
Peng Chen <peng1.chen@intel.com>
Peter Bigot <peter.bigot@nordicsemi.no>
Peter Bigot <peter.bigot@nordicsemi.no> <pab@pabigot.com>
Peter Johanson <peter@peterjohanson.com>
Piyush Itankar <piyush.t.itankar@intel.com>
Radosław Koppel <radoslaw.koppel@nordicsemi.no>
Radosław Koppel <radoslaw.koppel@nordicsemi.no> <r.koppel@k-el.com>
Raja D. Singh <rdsingh@iotwizards.com>
Ricardo Salveti <ricardo@opensourcefoundries.com>
Ricardo Salveti <ricardo@opensourcefoundries.com> <ricardo.salveti@linaro.org>
Ruud Derwig <Ruud.Derwig@synopsys.com>
Saku Rautio <saku.rautio@nordicsemi.no>
Scott Worley <scott.worley@microchip.com>
Sean Nyekjaer <sean@geanix.com> <sean.nyekjaer@prevas.dk>
Sean Nyekjaer <sean@geanix.com> <sean@nyekjaer.dk>
Sharron Liu <sharron.liu@intel.com>
Shilpashree L C <shilpashree.lc@intel.com>
Shuang He <shuang.he@intel.com>
Sigvart Hovland <sigvart.hovland@nordicsemi.no>
Stéphane D'Alu <sdalu@sdalu.com>
Stine Åkredalen <stine.akredalen@nordicsemi.no>
Thomas Heeley <thomas.heeley@intel.com>
Tim Sørensen <tims@demant.com>
Tim Sørensen <tims@demant.com> <tims@oticon.com>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no> <vich@nordicsemi.no>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no> <vinayak.kariappa@gmail.com>
Xiaorui Hu <xiaorui.hu@linaro.org>
Yannis Damigos <giannis.damigos@gmail.com> <ydamigos@iccs.gr>
Yonattan Louise <yonattan.a.louise.mendoza@intel.com>
Yonattan Louise <yonattan.a.louise.mendoza@intel.com> <yonattan.a.louise.mendoza@linux.intel.com>
YouhuaX Zhu <youhuax.zhu@intel.com>
Leona Cook <leonax.cook@intel.com> <leonax.cook@intel.com>
Thomas Heeley <thomas.heeley@intel.com> <thomas.heeley@intel.com>
Shuang He <shuang.he@intel.com> <shuang.he@intel.com>
Chuck Jordan <cjordan@synopsys.com> <cjordan@synopsys.com>
Jeremie Garcia <jeremie.garcia@intel.com> <jeremie.garcia@intel.com>
Bit Pathe <bitpathe@gmail.com> <bitpathe@gmail.com>
Carles Cufi <carles.cufi@nordicsemi.no> <carles.cufi@nordicsemi.no>
Kuo-Lang Tseng <kuo-lang.tseng@intel.com> <kuo-lang.tseng@intel.com>
Gerardo Aceves <gerardo.aceves@intel.com> <gerardo.aceves@intel.com>
Evan Couzens <evanx.couzens@intel.com> <evanx.couzens@intel.com>
Lei Liu <lei.a.liu@intel.com> <lei.a.liu@intel.com>
Douglas Su <d0u9.su@outlook.com> <d0u9.su@outlook.com>
Keren Siman-Tov <keren.siman-tov@intel.com> <keren.siman-tov@intel.com>
Naga Raja Rao Tulasi <tulasi.r@tcs.com> <tulasi.r@tcs.com>
Felipe Neves <ryukokki.felipe@gmail.com> <ryukokki.felipe@gmail.com>
Amir Kaplan <amir.kaplan@intel.com> <amir.kaplan@intel.com>
Anas Nashif <anas.nashif@intel.com> <anas.nashif@intel.com>

File diff suppressed because it is too large.


@@ -1,30 +0,0 @@
# Copyright (c) 2024 Basalte bv
# SPDX-License-Identifier: Apache-2.0
extend = ".ruff-excludes.toml"
line-length = 100
target-version = "py310"
[lint]
select = [
# zephyr-keep-sorted-start
"B", # flake8-bugbear
"E", # pycodestyle
"F", # pyflakes
"I", # isort
"SIM", # flake8-simplify
"UP", # pyupgrade
"W", # pycodestyle warnings
# zephyr-keep-sorted-stop
]
ignore = [
# zephyr-keep-sorted-start
"SIM108", # Allow if-else blocks instead of forcing ternary operator
# zephyr-keep-sorted-stop
]
[format]
quote-style = "preserve"
line-ending = "lf"

.shippable.yml

@@ -0,0 +1,132 @@
language: c
compiler: gcc
env:
global:
- SDK=0.9.2
- SANITYCHECK_OPTIONS=" --inline-logs -R"
- SANITYCHECK_OPTIONS_RETRY="${SANITYCHECK_OPTIONS} --only-failed --outdir=out-2nd-pass"
- ZEPHYR_SDK_INSTALL_DIR=/opt/sdk/zephyr-sdk-0.9.2
- ZEPHYR_GCC_VARIANT=zephyr
- USE_CCACHE=1
- MATRIX_BUILDS="2"
- MATRIX_BUILDS_EXTRA="3"
matrix:
- MATRIX_BUILD="1"
- MATRIX_BUILD="2"
- MATRIX_BUILD="3"
build:
cache: true
cache_dir_list:
- ${SHIPPABLE_BUILD_DIR}/ccache
pre_ci_boot:
image_name: zephyrprojectrtos/ci
image_tag: v0.2
pull: true
options: "-e HOME=/home/buildslave --privileged=true --tty --net=bridge --user buildslave"
ci:
- export CCACHE_DIR=${SHIPPABLE_BUILD_DIR}/ccache/.ccache
- >
if [ "$IS_PULL_REQUEST" = "true" ]; then
git rebase origin/${PULL_REQUEST_BASE_BRANCH};
fi
- source zephyr-env.sh
- ccache -c -s --max-size=2000M
- make host-tools
- export PREBUILT_HOST_TOOLS=${ZEPHYR_BASE}/bin
- >
if [ "$MATRIX_BUILD" = "3" -a "$IS_PULL_REQUEST" = "true" ]; then
export COMMIT_RANGE=origin/${PULL_REQUEST_BASE_BRANCH}..HEAD
echo "Building a Pull Request";
echo "- Building Documentation";
echo "Commit range:" ${COMMIT_RANGE}
make htmldocs > doc.log 2>&1;
./scripts/filter-known-issues.py --config-dir .known-issues/doc/ doc.log > doc.warnings;
if [ "$?" != 0 ]; then
echo " ==> Error running filter script"
exit 1
fi;
if [ -s doc.warnings ]; then
echo " => New documentation warnings/errors";
fi;
echo "- Verify commit message and coding style";
./scripts/ci/check-compliance.py --commits ${COMMIT_RANGE} || true;
fi;
- >
if [ "$IS_PULL_REQUEST" = "true" ]; then
./scripts/ci/get_modified_tests.py --commits origin/${PULL_REQUEST_BASE_BRANCH}..HEAD > modified_tests.args;
./scripts/ci/get_modified_boards.py --commits origin/${PULL_REQUEST_BASE_BRANCH}..HEAD > modified_boards.args;
if [ -s modified_boards.args ]; then
./scripts/sanitycheck --subset ${MATRIX_BUILD}/${MATRIX_BUILDS_EXTRA} +modified_boards.args || ./scripts/sanitycheck +modified_boards.args --only-failed;
cp ./scripts/sanity_chk/last_sanity.xml modified_boards.xml;
fi;
if [ -s modified_tests.args ]; then
./scripts/sanitycheck --subset ${MATRIX_BUILD}/${MATRIX_BUILDS_EXTRA} +modified_tests.args || ./scripts/sanitycheck +modified_tests.args --only-failed;
cp ./scripts/sanity_chk/last_sanity.xml modified_tests.xml;
fi;
rm -f modified_tests.args modified_boards.args;
fi
- >
if [ "$MATRIX_BUILD" != "3" ]; then
./scripts/sanitycheck ${PLATFORMS} --subset ${MATRIX_BUILD}/${MATRIX_BUILDS} ${COVERAGE} ${SANITYCHECK_OPTIONS} || ./scripts/sanitycheck ${PLATFORMS} ${COVERAGE} ${SANITYCHECK_OPTIONS_RETRY} || ./scripts/sanitycheck ${PLATFORMS} ${COVERAGE} ${SANITYCHECK_OPTIONS_RETRY}
fi;
- ccache -s
on_failure:
- rm -rf sanity-out out-2nd-pass
- mkdir -p shippable/testresults
- >
if [ -e compliance.xml ]; then
cp compliance.xml shippable/testresults/;
fi;
- >
if [ -e ./scripts/sanity_chk/last_sanity.xml ]; then
cp ./scripts/sanity_chk/last_sanity.xml shippable/testresults/;
fi;
- >
if [ -e ./modified_tests.xml ]; then
cp ./modified_tests.xml shippable/testresults/;
fi;
on_success:
- rm -rf sanity-out out-2nd-pass
- mkdir -p shippable/testresults
- >
if [ -e compliance.xml ]; then
cp compliance.xml shippable/testresults/;
fi;
- >
if [ -e ./scripts/sanity_chk/last_sanity.xml ]; then
cp ./scripts/sanity_chk/last_sanity.xml shippable/testresults/;
fi;
- >
if [ -e ./modified_tests.xml ]; then
cp ./modified_tests.xml shippable/testresults/;
fi;
integrations:
notifications:
- integrationName: slack_integration
type: slack
recipients:
- "#ci"
branches:
only:
- master
on_success: never
on_failure: always
- integrationName: email
type: email
recipients:
- builds@zephyrproject.org
branches:
only:
- master
- net
- bluetooth
- arm
on_success: never
on_failure: never


@@ -1,16 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
extends: default
rules:
line-length:
max: 100
comments:
min-spaces-from-content: 1
indentation:
spaces: 2
indent-sequences: consistent
document-start:
present: false
truthy:
check-keys: false

File diff suppressed because it is too large.

CODEOWNERS

@@ -0,0 +1,177 @@
# CODEOWNERS for autoreview assigning in github
# Do not use wildcard on all source yet
# * @galak @nashif
# Get all docs reviewed
*.rst @dbkinder
.known-issues/* @inakypg @nashif
MAINTAINERS @inakypg @nashif
arch/arc/* @cjordan44 @ruuddw
arch/arc/core/* @andrewboie
arch/arm/* @MaureenHelm @galak
arch/arm/core/* @andrewboie
arch/arm/soc/arm/mps2/* @fvincenzo
arch/arm/soc/st_stm32/* @erwango
arch/arm/soc/st_stm32/stm32f4/* @rsalveti @idlethread
arch/arm/soc/ti_simplelink/cc32xx @GAnthony
arch/nios2/* @andrewboie
arch/nios2/core/* @andrewboie
arch/riscv32 @fractalclone
arch/x86/* @andrewboie @youvedeep
arch/x86/core/* @andrewboie
arch/x86/core/crt0.S @youvedeep @nashif
arch/x86/soc/intel_quark/quark_d2000/* @nashif
arch/x86/soc/intel_quark/quark_se/* @nashif
arch/x86/soc/intel_quark/quark_x1000/* @nashif
arch/xtensa/* @andrewboie
boards/arc/* @cjordan44 @ruuddw
boards/arc/arduino_101_sss/* @nashif
boards/arc/em_starterkit/* @cjordan44
boards/arc/quark_se_c1000_ss_devboard/* @nashif
boards/arm/* @MaureenHelm @galak
boards/arm/96b_carbon/* @rsalveti @idlethread
boards/arm/96b_nitrogen/* @idlethread
boards/arm/arduino_101_ble/* @jhedberg
boards/arm/cc3220sf_launchxl/* @GAnthony
boards/arm/disco_l475_iot1/* @erwango
boards/arm/frdm_k64f/* @MaureenHelm
boards/arm/frdm_kw41z/* @MaureenHelm
boards/arm/hexiwear_k64/* @MaureenHelm
boards/arm/mps2_an385/* @fvincenzo
boards/arm/nrf51_blenano/* @rsalveti
boards/arm/nrf52_pca10040/* @carlescufi
boards/arm/nucleo_f401re/* @rsalveti @idlethread
boards/arm/v2m_beetle/* @fvincenzo
boards/arm/olimexino_stm32/* @ydamigos
boards/arm/stm32f3_disco/* @ydamigos
boards/nios2/* @andrewboie
boards/nios2/altera_max10/* @andrewboie
boards/riscv32 @fractalclone
boards/x86/* @andrewboie @youvedeep
boards/x86/arduino_101/* @nashif
boards/x86/galileo/* @nashif
boards/x86/quark_d2000_crb/* @nashif
boards/x86/quark_se_c1000_devboard/* @nashif
boards/xtensa/* @andrewboie
doc/* @dbkinder
doc/subsystems/bluetooth/* @sjanc @jhedberg @Vudentz
drivers/*/*qmsi* @nashif
drivers/*/*stm32* @erwango
drivers/bluetooth/* @sjanc @jhedberg @Vudentz
drivers/clock_control/*stm32f4* @rsalveti @idlethread
drivers/ethernet/* @jukkar @tbursztyka
drivers/flash/* @nashif
drivers/gpio/*stm32* @rsalveti @idlethread
drivers/gpio/gpio_pulpino.c @fractalclone
drivers/ieee802154/* @jukkar @tbursztyka
drivers/interrupt_controller/* @andrewboie
drivers/pinmux/stm32/* @rsalveti @idlethread
drivers/sensor/* @bogdan-davidoaia
drivers/serial/uart_altera_jtag.c @andrewboie
drivers/serial/uart_riscv_qemu.c @fractalclone
drivers/slip/* @jukkar @tbursztyka
drivers/spi/* @tbursztyka
drivers/spi/spi_ll_stm32.* @superna9999 @mbolivar
drivers/timer/altera_avalon_timer.c @andrewboie
drivers/timer/pulpino_timer.c @fractalclone
drivers/timer/riscv_machine_timer.c @fractalclone
drivers/usb @youvedeep @andyross
drivers/i2c/i2c_ll_stm32* @ldts @ydamigos
dts/arm/st/* @erwango
ext/fs/* @nashif
ext/hal/cmsis/* @MaureenHelm @galak
ext/hal/nordic/mdk/* @carlescufi
ext/hal/nxp/mcux/* @MaureenHelm
ext/hal/qmsi/* @nashif
ext/hal/st/stm32cube/* @erwango
ext/lib/crypto/mbedtls/* @nashif
ext/lib/crypto/tinycrypt/* @lpereira
include/arch/arc/* @cjordan44 @ruuddw
include/arch/arc/arch.h @andrewboie
include/arch/arc/v2/irq.h @andrewboie
include/arch/arm/* @MaureenHelm @galak
include/arch/arm/cortex_m/irq.h @andrewboie
include/arch/nios2/* @andrewboie
include/arch/nios2/arch.h @andrewboie
include/arch/riscv32 @fractalclone
include/arch/x86/* @andrewboie @youvedeep
include/arch/x86/arch.h @andrewboie
include/arch/xtensa/* @andrewboie
include/atomic.h @andrewboie @andyross
include/bluetooth/* @sjanc @jhedberg @Vudentz
include/cache.h @andrewboie @andyross
include/device.h @youvedeep @nashif
include/drivers/bluetooth/* @sjanc @jhedberg @Vudentz
include/drivers/ioapic.h @andrewboie
include/drivers/loapic.h @andrewboie
include/drivers/mvic.h @andrewboie
include/fs.h @nashif
include/fs/* @nashif
include/init.h @andrewboie @andyross
include/init.h @youvedeep @nashif
include/irq.h @andrewboie
include/irq.h @andrewboie @andyross
include/irq_offload.h @andrewboie @andyross
include/kernel.h @andrewboie @andyross
include/kernel_version.h @andrewboie @andyross
include/linker/linker-defs.h @andrewboie @andyross
include/linker/linker-tool-gcc.h @andrewboie @andyross
include/linker/linker-tool.h @andrewboie @andyross
include/linker/section_tags.h @andrewboie @andyross
include/linker/sections.h @andrewboie @andyross
include/misc/* @andrewboie @andyross
include/net/* @jukkar @tbursztyka
include/net/buf.h @jukkar @jhedberg @tbursztyka
include/power.h @youvedeep @nashif
include/sensor.h @bogdan-davidoaia
include/shared_irq.h @andrewboie @andyross
include/spi.h @tbursztyka
include/sw_isr_table.h @andrewboie @andyross
include/sys_clock.h @andrewboie @andyross
include/sys_io.h @andrewboie @andyross
include/toolchain.h @andrewboie @andyross
include/toolchain/* @andrewboie @andyross
include/zephyr.h @andrewboie @andyross
kernel/* @andrewboie @andyross
kernel/device.c @youvedeep @nashif
kernel/idle.c @youvedeep @nashif
samples/bluetooth/* @sjanc @jhedberg @Vudentz
samples/boards/quark_se_c1000/power*/* @youvedeep @nashif
samples/net/* @jukkar @tbursztyka
samples/net/dns_resolve/* @jukkar @tbursztyka
samples/net/http_server/* @jukkar @tbursztyka
samples/net/lwm2m_client/* @mike-scott
samples/net/mbedtls_sslclient/* @nashif
samples/net/mqtt_publisher/* @jukkar @tbursztyka
samples/net/zoap_client/* @vcgomes
samples/net/zoap_server/* @vcgomes
samples/sensor/* @bogdan-davidoaia
samples/subsys/usb @youvedeep @andyross
scripts/expr_parser.py @andrewboie
scripts/sanity_chk/* @andrewboie
scripts/sanitycheck @andrewboie
subsys/bluetooth/* @sjanc @jhedberg @Vudentz
subsys/bluetooth/controller/* @carlescufi @cvinayak
subsys/fs/* @nashif
subsys/net/buf.c @jukkar @jhedberg @tbursztyka
subsys/net/ip/* @jukkar @tbursztyka
subsys/net/lib/* @jukkar @tbursztyka
subsys/net/lib/dns/* @jukkar @tbursztyka
subsys/net/lib/http/* @jukkar @tbursztyka
subsys/net/lib/lwm2m/* @mike-scott
subsys/net/lib/mqtt/* @jukkar @tbursztyka
subsys/net/lib/zoap/* @vcgomes
subsys/usb @youvedeep @andyross
tests/bluetooth/* @sjanc @jhedberg @Vudentz
tests/crypto/* @lpereira
tests/crypto/mbedtls/* @nashif
tests/drivers/spi/* @tbursztyka
tests/kernel/* @andrewboie @andyross
tests/net/* @jukkar @tbursztyka
tests/net/buf/* @jukkar @jhedberg @tbursztyka
tests/net/lib/* @jukkar @tbursztyka
tests/net/lib/http_header_fields/* @jukkar @tbursztyka
tests/net/lib/mqtt_packet/* @jukkar @tbursztyka
tests/net/lib/zoap/* @vcgomes
tests/subsys/fs/* @nashif


@@ -1,139 +0,0 @@
# Contributor Covenant Code of Conduct
## Our Pledge
We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, caste, color, religion, or sexual
identity and orientation.
We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.
## Our Standards
Examples of behavior that contributes to a positive environment for our
community include:
* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
and learning from the experience
* Focusing on what is best not just for us as individuals, but for the overall
community
Examples of unacceptable behavior include:
* The use of sexualized language or imagery, and sexual attention or advances of
any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email address,
without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Enforcement Responsibilities
Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.
Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, wiki edits, issues, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.
## Scope
This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
conduct@zephyrproject.org. Reports will be received by the Chair of the Zephyr
Governing Board, the Zephyr Project Director (Linux Foundation), and the Zephyr
Project Developer Advocate (Linux Foundation). You may refer to the [Governing
Board](https://zephyrproject.org/governing-board/) and [Linux Foundation
Staff](https://zephyrproject.org/staff/) web pages to identify who are the
individuals currently holding these positions.
All complaints will be reviewed and investigated promptly and fairly.
All community leaders are obligated to respect the privacy and security of the
reporter of any incident.
## Enforcement Guidelines
Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:
### 1. Correction
**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.
**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.
### 2. Warning
**Community Impact**: A violation through a single incident or series of
actions.
**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or permanent
ban.
### 3. Temporary Ban
**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.
**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.
### 4. Permanent Ban
**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior, harassment of an
individual, or aggression toward or disparagement of classes of individuals.
**Consequence**: A permanent ban from any sort of public interaction within the
community.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.1, available at
[https://www.contributor-covenant.org/version/2/1/code_of_conduct.html][v2.1].
The only changes made by The Zephyr Project to the original document were to
make explicit who the recipients of Code of Conduct incident reports are.
Community Impact Guidelines were inspired by
[Mozilla's code of conduct enforcement ladder][Mozilla CoC].
For answers to common questions about this code of conduct, see the FAQ at
[https://www.contributor-covenant.org/faq][FAQ]. Translations are available at
[https://www.contributor-covenant.org/translations][translations].
[homepage]: https://www.contributor-covenant.org
[v2.1]: https://www.contributor-covenant.org/version/2/1/code_of_conduct.html
[Mozilla CoC]: https://github.com/mozilla/diversity
[FAQ]: https://www.contributor-covenant.org/faq
[translations]: https://www.contributor-covenant.org/translations


@@ -6,34 +6,475 @@ patches directly to the project. In our collaborative open source environment,
standards and methods for submitting changes help reduce the chaos that can result
from an active development community.
This document briefly summarizes the full `Contribution
Guidelines <http://docs.zephyrproject.org/latest/contribute/index.html>`_
documentation.
This document explains how to participate in project conversations, log bugs
and enhancement requests, and submit patches to the project so your patch will
be accepted quickly in the codebase.
* Zephyr uses the permissive open source `Apache 2.0 license`_
that allows you to freely use, modify, distribute and sell your own products
that include Apache 2.0 licensed software.
* There are some imported or reused components of the Zephyr project that
use other licensing and are clearly identified.
Licensing
*********
* The Developer Certificate of Origin (DCO) process is followed to
ensure developers are following licensing criteria for their
contributions, and documented with a ``Signed-off-by`` line in commits.
Licensing is very important to open source projects. It helps ensure the
software continues to be available under the terms that the author desired.
* Zephyr development workflow is supported on Linux, macOS, and Windows
(with a few exceptions).
.. _Apache 2.0 license:
https://github.com/zephyrproject-rtos/zephyr/blob/master/LICENSE
* Source code for the project is maintained in the GitHub repo:
https://github.com/zephyrproject-rtos/zephyr
.. _GitHub repo: https://github.com/zephyrproject-rtos/zephyr
* Issue and feature tracking is done using GitHub issues in this repo.
Zephyr uses the `Apache 2.0 license`_ (as found in the LICENSE file in the
project's `GitHub repo`_) to strike a balance between open contribution and
allowing you to use the software however you would like to. There are some
imported or reused components of the Zephyr project that use other licensing,
as described in `Zephyr Licensing`_.
* A Continuous Integration (CI) system runs on every Pull Request (PR)
to verify several aspects of the PR including Git commit formatting,
Coding Style, sanity check builds, and documentation builds.
.. _Zephyr Licensing:
https://www.zephyrproject.org/doc/LICENSING.html
* The `Zephyr devel mailing list`_ is a great place to engage with the
community, ask questions, discuss issues, and help each other.
The license tells you what rights you have as a developer, provided by the
copyright holder. It is important that the contributor fully understands the
licensing rights and agrees to them. Sometimes the copyright holder isn't the
contributor, such as when the contributor is doing work on behalf of a
company.
.. _Zephyr devel mailing list: https://lists.zephyrproject.org/g/devel
.. _DCO:
Developer Certification of Origin (DCO)
***************************************
To make a good faith effort to ensure licensing criteria are met, the Zephyr
project requires the Developer Certificate of Origin (DCO) process to be
followed.
The DCO is an attestation attached to every contribution made by every
developer. In the commit message of the contribution (described more fully
later in this document), the developer simply adds a ``Signed-off-by``
statement and thereby agrees to the DCO.
When a developer submits a patch, it is a commitment that the contributor has
the right to submit the patch per the license. The DCO agreement is shown
below and at http://developercertificate.org/.
.. code-block:: none
Developer's Certificate of Origin 1.1
By making a contribution to this project, I certify that:
(a) The contribution was created in whole or in part by me and I
have the right to submit it under the open source license
indicated in the file; or
(b) The contribution is based upon previous work that, to the
best of my knowledge, is covered under an appropriate open
source license and I have the right under that license to
submit that work with modifications, whether created in whole
or in part by me, under the same open source license (unless
I am permitted to submit under a different license), as
indicated in the file; or
(c) The contribution was provided directly to me by some other
person who certified (a), (b) or (c) and I have not modified
it.
(d) I understand and agree that this project and the contribution
are public and that a record of the contribution (including
all personal information I submit with it, including my
sign-off) is maintained indefinitely and may be redistributed
consistent with this project or the open source license(s)
involved.
DCO Sign-Off Methods
====================
The DCO requires that a sign-off message in the following format appear on each
commit in the pull request::
Signed-off-by: Zephyrus Zephyr <zephyrus@zephyrproject.org>
The DCO text can either be manually added to your commit body, or you can add
either ``-s`` or ``--signoff`` to your usual Git commit commands. If you forget
to add the sign-off you can also amend a previous commit with the sign-off by
running ``git commit --amend -s``. If you've pushed your changes to GitHub
already you'll need to force push your branch after this with ``git push -f``.
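For example, a typical sequence (using the ``fix_comment_typo`` branch name
from the workflow example later in this guide) might look like::
$ git commit -s                        # commit with the sign-off added automatically
$ git commit --amend -s                # add a missing sign-off to the last commit
$ git push -f origin fix_comment_typo  # branch name here is only an illustration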
Prerequisites
*************
.. _Zephyr Project website: https://zephyrproject.org
As a contributor, you'll want to be familiar with the Zephyr project, how to
configure, install, and use it as explained in the `Zephyr Project website`_
and how to set up your development environment as introduced in the Zephyr
`Getting Started Guide`_.
.. _Getting Started Guide:
https://www.zephyrproject.org/doc/getting_started/getting_started.html
The examples below use a Linux host environment for Zephyr development.
You should be familiar with common developer tools such as Git and Make, and
platforms such as GitHub.
If you haven't already done so, you'll need to create a (free) GitHub account
on http://github.com and have Git tools available on your development system.
Repository layout
*****************
To clone the main Zephyr Project repository use::
$ git clone https://github.com/zephyrproject-rtos/zephyr
The Zephyr project directory structure is described in `Source Tree Structure`_
documentation. In addition to the Zephyr kernel itself, you'll also find the
sources for technical documentation, sample code, supported board
configurations, and a collection of subsystem tests. All of these are
available for developers to contribute to and enhance.
.. _Source Tree Structure:
https://www.zephyrproject.org/doc/kernel/overview/source_tree.html
Pull Requests and Issues
************************
.. _Zephyr Project Issues: https://jira.zephyrproject.org
.. _open pull requests: https://github.com/zephyrproject-rtos/zephyr/pulls
.. _Zephyr-devel mailing list:
https://lists.zephyrproject.org/mailman/listinfo/zephyr-devel
Before starting on a patch, first check in our Jira `Zephyr Project Issues`_
system to see what's been reported on the issue you'd like to address. Have a
conversation on the `Zephyr-devel mailing list`_ (or the #zephyrproject IRC
channel on freenode.net) to see what others think of your issue (and proposed
solution). You may find others that have encountered the issue you're
finding, or that have similar ideas for changes or additions. Send a message
to the `Zephyr-devel mailing list`_ to introduce and discuss your idea with
the development community.
It's always a good practice to search for existing or related issues before
submitting your own. When you submit an issue (bug or feature request), the
triage team will review and comment on the submission, typically within a few
business days.
You can find all `open pull requests`_ on GitHub and open `Zephyr Project
Issues`_ in Jira.
Development Tools and Git Setup
*******************************
Signed-off-by
=============
The name in the commit message ``Signed-off-by:`` line and your email must
match the change authorship information. Make sure your *.git/config* is set
up correctly::
$ git config --global user.name "David Developer"
$ git config --global user.email "david.developer@company.com"
gitlint
=========
When you submit a pull request to the project, a series of checks is
performed to verify that your commit messages meet the requirements. The same
check done during the CI process can be performed locally using the *gitlint*
command.
Install gitlint and run it locally in your tree and branch where your patches
have been committed::
$ sudo pip3 install gitlint
$ gitlint
Note, gitlint only checks HEAD (the most recent commit), so you should run it
after each commit, or use the ``--commits`` option to specify a commit range
covering all the development patches to be submitted.
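For example, assuming your topic branch was created on top of the ``upstream``
remote's master branch (the remote set up later in this guide), you could lint
every commit on the branch with::
$ gitlint --commits upstream/master..HEAD   # range assumes an 'upstream' remote pointing at the main repo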
sanitycheck
===========
To verify that your changes did not break any tests or samples, please run the
``sanitycheck`` script locally before submitting your pull request to GitHub. To
run the same tests the CI system runs, follow these steps from within your
local Zephyr source working directory::
$ source zephyr-env.sh
$ make host-tools
$ export PREBUILT_HOST_TOOLS=${ZEPHYR_BASE}/bin
$ export USE_CCACHE=1
$ ./scripts/sanitycheck
The above will execute the basic sanitycheck script, which will run various
kernel tests using the QEMU emulator. It will also do some build tests on
various samples with advanced features that can't run in QEMU.
We highly recommend you run these tests locally to avoid any CI failures.
Using CCACHE and pre-built host tools is optional, but it speeds up the
execution time considerably.
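sanitycheck also accepts filtering options; run ``./scripts/sanitycheck --help``
for the full list supported by your copy of the script. As a sketch, assuming
the ``--platform`` filter is available in your version, a quicker run against a
single emulated platform might look like::
$ ./scripts/sanitycheck --platform qemu_x86   # --platform availability is an assumption; check --help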
Coding Style
************
Use these coding guidelines to ensure that your development complies with the
project's style and naming conventions.
.. _Linux kernel coding style:
https://kernel.org/doc/html/latest/process/coding-style.html
In general, follow the `Linux kernel coding style`_, with the
following exceptions:
* Add braces to every ``if`` and ``else`` body, even for single-line code
blocks. Use the ``--ignore BRACES`` flag to make *checkpatch* stop
complaining.
* Use spaces instead of tabs to align comments after declarations, as needed.
* Use C89-style single line comments, ``/* */``. The C99-style single line
comment, ``//``, is not allowed.
* Use ``/** */`` for doxygen comments that need to appear in the documentation.
The Linux kernel GPL-licensed tool ``checkpatch`` is used to check coding
style conformity. Checkpatch is available in the scripts directory. To invoke
it when committing code, edit your *.git/hooks/pre-commit* file to contain:
.. code-block:: bash
#!/bin/sh
set -e
exec git diff --cached | ${ZEPHYR_BASE}/scripts/checkpatch.pl - || true
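You can also run the same check by hand over your staged changes before
committing; this is the same pipeline the hook above uses::
$ git diff --cached | ${ZEPHYR_BASE}/scripts/checkpatch.pl -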
Contribution Workflow
*********************
One general practice we encourage is to make small,
controlled changes. This practice simplifies review, makes merging and
rebasing easier, and keeps the change history clear and clean.
When contributing to the Zephyr Project, it is also important you provide as much
information as you can about your change, update appropriate documentation,
and test your changes thoroughly before submitting.
The general GitHub workflow used by Zephyr developers uses a combination of
command line Git commands and browser interaction with GitHub. As it is with
Git, there are multiple ways of getting a task done. We'll describe a typical
workflow here:
.. _Create a Fork of Zephyr:
https://github.com/zephyrproject-rtos/zephyr#fork-destination-box
#. `Create a Fork of Zephyr`_
to your personal account on GitHub. (Click on the fork button in the top
right corner of the Zephyr project repo page in GitHub.)
#. On your development computer, clone the fork you just made::
$ git clone https://github.com/<your github id>/zephyr
This would be a good time to let Git know about the upstream repo too::
$ git remote add upstream https://github.com/zephyrproject-rtos/zephyr.git
and verify the remote repos::
$ git remote -v
#. Create a topic branch (off of master) for your work (if you're addressing
a Jira issue, we suggest including the Jira issue number in the branch name)::
$ git checkout master
$ git checkout -b fix_comment_typo
Some Zephyr subsystems do development work on a separate branch from
master so you may need to indicate this in your checkout::
$ git checkout -b fix_out_of_date_patch origin/net
#. Make changes, test locally, change, test, test again, ... (Check out the
prior chapter on `sanitycheck`_ as well).
#. When things look good, start the pull request process by adding your changed
files::
$ git add [file(s) that changed, add -p if you want to be more specific]
You can see files that are not yet staged using::
$ git status
#. Verify changes to be committed look as you expected::
$ git diff --cached
#. Commit your changes to your local repo::
$ git commit -s
The ``-s`` option automatically adds your ``Signed-off-by:`` to your commit
message. Your commit will be rejected without this line that indicates your
agreement with the `DCO`_. See the `Commit Guidelines`_ section for
specific guidelines for writing your commit messages.
#. Push your topic branch with your changes to your fork in your personal
GitHub account::
$ git push origin fix_comment_typo
#. In your web browser, go to your forked repo and click on the
``Compare & pull request`` button for the branch you just worked on and
you want to open a pull request with.
#. Review the pull request changes, and verify that you are opening a
pull request for the appropriate branch. The title and message from your
commit message should appear as well.
#. If you're working on a subsystem branch that's not ``master``,
you may need to change the intended branch for the pull request
here, for example, by changing the base branch from ``master`` to ``net``.
#. GitHub will assign one or more suggested reviewers (based on the
CODEOWNERS file in the repo). If you are a project member, you can
select additional reviewers now too.
#. Click on the submit button and your pull request is sent and awaits
review. Email will be sent as review comments are made, or you can check
on your pull request at https://github.com/zephyrproject-rtos/zephyr/pulls.
#. While you're waiting for your pull request to be accepted and merged, you
can create another branch to work on another issue. (Be sure to make your
new branch off of master and not the previous branch.)::
$ git checkout master
$ git checkout -b fix_another_issue
and use the same process described above to work on this new topic branch.
#. If reviewers do request changes to your patch, you can interactively rebase
commit(s) to fix review issues. In your development repo::
$ git fetch --all
$ git rebase --ignore-whitespace upstream/master
The ``--ignore-whitespace`` option stops ``git apply`` (called by rebase)
from changing any whitespace. Continuing::
$ git rebase -i <offending-commit-id>^
In the interactive rebase editor, replace ``pick`` with ``edit`` to select
a specific commit (if there's more than one in your pull request), or
remove the line to delete a commit entirely. Then edit files to fix the
issues in the review.
As before, inspect and test your changes. When ready, continue the
patch submission::
$ git add [file(s)]
$ git rebase --continue
Update commit comment if needed, and continue::
$ git push --force origin fix_comment_typo
By force pushing your update, your original pull request will be updated
with your changes so you won't need to resubmit the pull request.
Commit Guidelines
*****************
Changes are submitted as Git commits. Each commit message must contain:
* A short and descriptive subject line that is less than 72 characters,
followed by a blank line. The subject line must include a prefix that
identifies the subsystem being changed, followed by a colon, and a short
title, for example: ``doc: update wiki references to new site``.
(If you're updating an existing file, you can use
``git log <filename>`` to see what developers used as the prefix for
previous patches of this file.)
* A change description with your logic or reasoning for the changes, followed
by a blank line.
* A Signed-off-by line, ``Signed-off-by: <name> <email>``, typically added
automatically by using ``git commit -s``
* If the change addresses a Jira issue, include a line of the form::
Jira: ZEP-xxx
All changes and topics sent to GitHub must be well-formed, as described above.
Commit Message Body
===================
When editing the commit message, please briefly explain what your change
does and why it's needed. A change summary of ``"Fixes stuff"`` will be rejected. An
empty change summary is only acceptable for trivial changes fully described by
the commit title (e.g., ``doc: fix misspellings in CONTRIBUTING doc``).
The description body of the commit message must include:
* **what** the change does,
* **why** you chose that approach,
* **what** assumptions were made, and
* **how** you know it works -- for example, which tests you ran.
For examples of accepted commit messages, you can refer to the Zephyr GitHub
`changelog <https://github.com/zephyrproject-rtos/zephyr/commits/master>`__.
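As an illustration only (not taken from the changelog), a hypothetical commit
message following these guidelines might read::

doc: update contribution guidelines for local CI checks

Describe how to run gitlint and sanitycheck locally before submitting a
pull request, so contributors can catch commit message and build failures
before the CI system reports them.

Jira: ZEP-xxx
Signed-off-by: David Developer <david.developer@company.com>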
Other Commit Expectations
=========================
* Commits must build cleanly when applied on top of each other, thus avoiding
breaking bisectability.
* Commits must pass the *scripts/checkpatch.pl* requirements.
* Each commit must address a single identifiable issue and must be
logically self-contained. Unrelated changes should be submitted as
separate commits.
* You may submit pull request RFCs (requests for comments) to send work
proposals, progress snapshots of your work, or to get early feedback on
features or changes that will affect multiple areas in the code base.
Identifying Contribution Origin
===============================
When adding a new file to the tree, it is important to detail the source of
origin on the file, provide attributions, and detail the intended usage. In
cases where the file is an original to Zephyr, the commit message should
include the following ("Original" is the assumption if no Origin tag is
present)::
Origin: Original
In cases where the file is imported from an external project, the commit
message shall contain details regarding the original project, the location of
the project, the SHA-id of the origin commit for the file, the intended
purpose, and if the file will be maintained by the Zephyr project
(whether or not the Zephyr project will contain a localized branch or if
it is a downstream copy).
For example, a copy of a locally maintained import::
Origin: Contiki OS
License: BSD 3-Clause
URL: http://www.contiki-os.org/
commit: 853207acfdc6549b10eb3e44504b1a75ae1ad63a
Purpose: Introduction of networking stack.
Maintained-by: Zephyr
For example, a copy of an externally maintained import::
Origin: Tiny Crypt
License: BSD 3-Clause
URL: https://github.com/01org/tinycrypt
commit: 08ded7f21529c39e5133688ffb93a9d0c94e5c6e
Purpose: Introduction of TinyCrypt
Maintained-by: External

Kbuild (new file, 56 lines)

@@ -0,0 +1,56 @@
# vim: filetype=make
define filechk_configs.c
(echo "/* file is auto-generated, do not modify ! */"; \
echo; \
echo "#include <toolchain.h>"; \
echo; \
echo "GEN_ABS_SYM_BEGIN (_ConfigAbsSyms)"; \
echo; \
cat $(CURDIR)/include/generated/autoconf.h | sed \
's/".*"/1/' | awk \
'/#define/{printf "GEN_ABSOLUTE_SYM(%s, %s);\n", $$2, $$3}'; \
echo; \
echo "GEN_ABS_SYM_END";)
endef
misc/generated/configs.c: include/config/auto.conf FORCE
$(call filechk,configs.c)
targets := misc/generated/configs.c
targets += include/generated/offsets.h
always := misc/generated/configs.c
always += include/generated/offsets.h
define rule_cc_o_c_1
$(call echo-cmd,cc_o_c_1) $(cmd_cc_o_c_1);
endef
cmd_cc_o_c_1 = $(CC) $(KBUILD_CPPFLAGS) $(KBUILD_CFLAGS) $(ZEPHYRINCLUDE) -c -o $@ $<
arch/$(ARCH)/core/offsets/offsets.o: arch/$(ARCH)/core/offsets/offsets.c $(KCONFIG_CONFIG) \
include/generated/generated_dts_board.h
$(Q)mkdir -p $(dir $@)
$(call if_changed,cc_o_c_1)
define offsetchk
$(Q)set -e; \
$(kecho) ' CHK $@'; \
mkdir -p $(dir $@); \
$(srctree)/scripts/gen_offset_header.py -i $(1) -o $@.tmp; \
if [ -r $@ ] && cmp -s $@ $@.tmp; then \
rm -f $@.tmp; \
else \
$(kecho) ' UPD $@'; \
mv -f $@.tmp $@; \
fi
endef
include/generated/offsets.h: arch/$(ARCH)/core/offsets/offsets.o \
include/config/auto.conf FORCE
$(call offsetchk,arch/$(ARCH)/core/offsets/offsets.o)


@@ -1,8 +1,10 @@
# General configuration options
# Kconfig - general configuration options
#
# Copyright (c) 2014-2015 Wind River Systems, Inc.
#
# SPDX-License-Identifier: Apache-2.0
#
mainmenu "Zephyr Kernel Configuration"
source "Kconfig.zephyr"


@@ -1,19 +0,0 @@
# Constant variables to be used across Kconfig options
# Copyright (c) 2024 basalte bv
# SPDX-License-Identifier: Apache-2.0
INT8_MIN := -128
INT16_MIN := -32768
INT32_MIN := -2147483648
INT64_MIN := -9223372036854775808
INT8_MAX := 127
INT16_MAX := 32767
INT32_MAX := 2147483647
INT64_MAX := 9223372036854775807
UINT8_MAX := 255
UINT16_MAX := 65535
UINT32_MAX := 4294967295
UINT64_MAX := 18446744073709551615

File diff suppressed because it is too large.

MAINTAINERS (new file, 493 lines)

@@ -0,0 +1,493 @@
Originally from the Linux Kernel.
# Licensed under the terms of the GNU GPL License version 2
Descriptions of section entries:
P: Person (obsolete)
M: Mail patches to: FullName <address@domain>
R: Designated reviewer: FullName <address@domain>
These reviewers should be CCed on patches.
L: Mailing list that is relevant to this area
W: Web-page with status/info
Q: Patchwork web based patch tracking system site
T: SCM tree type and location.
Type is one of: git, hg, quilt, stgit, topgit
S: Status, one of the following:
Supported: Someone is actually paid to look after this.
Maintained: Someone actually looks after it.
Odd Fixes: It has a maintainer but they don't have time to do
much other than throw the odd patch in. See below..
Orphan: No current maintainer [but maybe you could take the
role as you write your new code].
Obsolete: Old code. Something tagged obsolete generally means
it has been replaced by a better system and you
should be using that.
F: Files and directories with wildcard patterns.
A trailing slash includes all files and subdirectory files.
F: drivers/net/ all files in and below drivers/net
F: drivers/net/* all files in drivers/net, but not below
F: */net/* all files in "any top level directory"/net
One pattern per line. Multiple F: lines acceptable.
N: Files and directories with regex patterns.
N: [^a-z]tegra all files whose path contains the word tegra
One pattern per line. Multiple N: lines acceptable.
scripts/get_maintainer.pl has different behavior for files that
match F: pattern and matches of N: patterns. By default,
get_maintainer will not look at git log history when an F: pattern
match occurs. When an N: match occurs, git log history is used
to also notify the people that have git commit signatures.
X: Files and directories that are NOT maintained, same rules as F:
Files exclusions are tested before file matches.
Can be useful for excluding a specific subdirectory, for instance:
F: net/
X: net/ipv6/
matches all files in and below net excluding net/ipv6/
K: Keyword perl extended regex pattern to match content in a
patch or file. For instance:
K: of_get_profile
matches patches or files that contain "of_get_profile"
K: \b(printk|pr_(info|err))\b
matches patches or files that contain one or more of the words
printk, pr_info or pr_err
One regex pattern per line. Multiple K: lines acceptable.
Note: For the hard of thinking, this list is meant to remain in alphabetical
order. If you could add yourselves to it in alphabetical order that would be
so much easier [Ed]
Maintainers List (try to look for most precise areas first)
-----------------------------------
ARC ARCHITECTURE
M: Ruud Derwig <Ruud.Derwig@synopsys.com>
M: Chuck Jordan <cjordan@synopsys.com>
S: Supported
F: arch/arc/
F: include/arch/arc/
F: boards/arc/
ARM ARCHITECTURE
M: Maureen Helm <maureen.helm@nxp.com>
M: Kumar Gala <kumar.gala@linaro.org>
S: Supported
F: arch/arm/
F: include/arch/arm/
F: boards/arm/
ARM CORTEX MICROCONTROLLER SOFTWARE INTERFACE STANDARD (CMSIS)
M: Maureen Helm <maureen.helm@nxp.com>
M: Kumar Gala <kumar.gala@linaro.org>
S: Supported
F: ext/hal/cmsis/
BOARDS/ARC - ARDUINO 101 SSS
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: boards/arc/arduino_101_sss/
BOARDS/ARC - EM Starterkit
M: Chuck Jordan <cjordan@synopsys.com>
S: Supported
F: boards/arc/em_starterkit/
BOARDS/ARC - QUARK SE C1000 SS Devboard
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: boards/arc/quark_se_c1000_ss_devboard/
BOARDS/ARM - 96Boards CARBON
M: Amit Kucheria <amit.kucheria@linaro.org>
M: Ricardo Salveti <ricardo.salveti@linaro.org>
S: Supported
F: boards/arm/96b_carbon/
BOARDS/ARM - 96Boards NITROGEN
M: Amit Kucheria <amit.kucheria@linaro.org>
S: Supported
F: boards/arm/96b_nitrogen/
BOARDS/ARM - ARDUINO 101 BLE
M: Johan Hedberg <johan.hedberg@intel.com>
S: Supported
F: boards/arm/arduino_101_ble/
BOARDS/ARM - CC32XX LAUNCHXL
M: Gil Pitney <gil.pitney@linaro.org>
S: Supported
F: boards/arm/cc3220sf_launchxl/
BOARDS/ARM - NXP FRDM-K64F
M: Maureen Helm <maureen.helm@nxp.com>
S: Supported
F: boards/arm/frdm_k64f/
BOARDS/ARM - NXP FRDM-KW41Z
M: Maureen Helm <maureen.helm@nxp.com>
S: Supported
F: boards/arm/frdm_kw41z/
BOARDS/ARM - NXP Hexiwear
M: Maureen Helm <maureen.helm@nxp.com>
S: Supported
F: boards/arm/hexiwear_k64/
BOARDS/ARM - NORDIC NRF51 REDBEAR BLENANO
M: Ricardo Salveti <ricardo.salveti@linaro.org>
S: Supported
F: boards/arm/nrf51_blenano/
BOARDS/ARM - NORDIC NRF52 PCA10040
M: Carles Cufi <carles.cufi@nordicsemi.no>
S: Supported
F: boards/arm/nrf52_pca10040/
BOARDS/ARM - NUCLEO-64 F401RE Devboard
M: Amit Kucheria <amit.kucheria@linaro.org>
M: Ricardo Salveti <ricardo.salveti@linaro.org>
S: Supported
F: boards/arm/nucleo_f401re/
BOARDS/ARM - ARM LTD V2M Beetle
M: Vincenzo Frascino <vincenzo.frascino@linaro.org>
S: Supported
F: boards/arm/v2m_beetle/
BOARDS/NIOS2 - ALTERA MAX10
M: Andrew Boie <andrew.p.boie@intel.com>
S: Supported
F: boards/nios2/altera_max10/
BOARDS/X86 - ARDUINO 101
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: boards/x86/arduino_101/
BOARDS/X86 - Galileo
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: boards/x86/galileo/
BOARDS/X86 - QUARK D2000 Devboard
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: boards/x86/quark_d2000_crb/
BOARDS/X86 - QUARK SE C1000 Devboard
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: boards/x86/quark_se_c1000_devboard/
BLUETOOTH
M: Johan Hedberg <johan.hedberg@intel.com>
M: Luiz Augusto von Dentz <luiz.dentz@gmail.com>
M: Szymon Janc <szymon.janc@gmail.com>
S: Supported
W: https://www.zephyrproject.org/doc/subsystems/bluetooth/bluetooth.html
F: subsys/bluetooth/
F: include/bluetooth/
F: include/drivers/bluetooth/
F: drivers/bluetooth/
F: samples/bluetooth/
F: tests/bluetooth/
F: doc/subsystems/bluetooth/
BLUETOOTH CONTROLLER
M: Vinayak Chettimada <vinayak.kariappa.chettimada@nordicsemi.no>
M: Carles Cufi <carles.cufi@nordicsemi.no>
S: Supported
F: subsys/bluetooth/controller/
CC32XX SDK
M: Gil Pitney <gil.pitney@linaro.org>
S: Supported
F: ext/hal/ti/simplelink
CC32XX SOC - TI SIMPLELINK
M: Gil Pitney <gil.pitney@linaro.org>
S: Supported
F: arch/arm/soc/ti_simplelink/cc32xx
DOCUMENTATION
M: David Kinder <david.b.kinder@intel.com>
M: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
S: Supported
F: doc/
F: *.rst
FILE SYSTEM
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: ext/fs/
F: subsys/fs/
F: include/fs/
F: include/fs.h
F: tests/subsys/fs
FLASH DRIVER
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: drivers/flash/
INTERRUPTS
M: Andrew Boie <andrew.p.boie@intel.com>
S: Supported
F: drivers/interrupt_controller/
F: arch/arc/core/
F: arch/arm/core/
F: arch/nios2/core/
F: arch/x86/core/
F: include/irq.h
F: include/arch/x86/arch.h
F: include/arch/arm/cortex_m/irq.h
F: include/arch/nios2/arch.h
F: include/arch/arc/arch.h
F: include/arch/arc/v2/irq.h
F: include/drivers/loapic.h
F: include/drivers/ioapic.h
F: include/drivers/mvic.h
KERNEL CORE
M: Andrew Boie <andrew.p.boie@intel.com>
M: Andy Ross <andrew.j.ross@intel.com>
S: Supported
F: kernel/
F: include/misc/
F: include/toolchain/
F: include/atomic.h
F: include/cache.h
F: include/init.h
F: include/irq.h
F: include/irq_offload.h
F: include/kernel_version.h
F: include/linker/linker-defs.h
F: include/linker/linker-tool-gcc.h
F: include/linker/linker-tool.h
F: include/linker/section_tags.h
F: include/linker/sections.h
F: include/shared_irq.h
F: include/sw_isr_table.h
F: include/sys_clock.h
F: include/sys_io.h
F: include/toolchain.h
F: include/zephyr.h
F: include/kernel.h
F: tests/kernel/
KNOWN ISSUES
M: Anas Nashif <anas.nashif@intel.com>
M: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
F: .known-issues/
LWM2M
M: Michael Scott <michael.scott@linaro.org>
S: Supported
F: subsys/net/lib/lwm2m/
F: samples/net/lwm2m_client/
MAINTAINERS
M: Anas Nashif <anas.nashif@intel.com>
M: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
S: Supported
F: MAINTAINERS
MBEDTLS
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: ext/lib/crypto/mbedtls/
F: samples/net/mbedtls_sslclient/
F: tests/crypto/mbedtls/
MCUXPRESSO SOFTWARE DEVELOPMENT KIT (MCUX)
M: Maureen Helm <maureen.helm@nxp.com>
S: Supported
F: ext/hal/nxp/mcux/
MPS2 - ARM LTD CORTEX-M PROTOTYPING SYSTEM
M: Vincenzo Frascino <vincenzo.frascino@linaro.org>
S: Supported
F: arch/arm/soc/arm/mps2/
F: boards/arm/mps2_an385/
NETWORKING
M: Jukka Rissanen <jukka.rissanen@linux.intel.com>
M: Tomasz Bursztyka <tomasz.bursztyka@linux.intel.com>
S: Supported
W: https://www.zephyrproject.org/doc/subsystems/networking/networking.html
F: subsys/net/ip/
F: subsys/net/lib/
F: include/net/
F: samples/net/
F: tests/net/
F: tests/net/lib/
F: drivers/ethernet/
F: drivers/ieee802154/
F: drivers/slip/
NETWORK APPLICATIONS
M: Jukka Rissanen <jukka.rissanen@linux.intel.com>
M: Tomasz Bursztyka <tomasz.bursztyka@linux.intel.com>
S: Supported
F: subsys/net/lib/dns/
F: subsys/net/lib/http/
F: subsys/net/lib/mqtt/
F: samples/net/dns_resolve/
F: samples/net/http_server/
F: samples/net/mqtt_publisher/
F: tests/net/lib/http_header_fields/
F: tests/net/lib/mqtt_packet/
NETWORK BUFFERS
M: Johan Hedberg <johan.hedberg@intel.com>
M: Jukka Rissanen <jukka.rissanen@linux.intel.com>
M: Tomasz Bursztyka <tomasz.bursztyka@linux.intel.com>
S: Supported
W: https://www.zephyrproject.org/doc/subsystems/networking/buffers.html
F: subsys/net/buf.c
F: include/net/buf.h
F: tests/net/buf/
NIOS II
M: Andrew Boie <andrew.p.boie@intel.com>
S: Supported
F: arch/nios2/
F: include/arch/nios2/
F: drivers/serial/uart_altera_jtag.c
F: drivers/timer/altera_avalon_timer.c
F: boards/nios2/
NORDIC MDK
M: Carles Cufi <carles.cufi@nordicsemi.no>
S: Supported
F: ext/hal/nordic/mdk/
POWER MANAGEMENT
M: Anas Nashif <anas.nashif@intel.com>
M: Youvedeep Singh <youvedeep.singh@intel.com>
S: Supported
F: arch/x86/core/crt0.S
F: include/device.h
F: include/init.h
F: include/power.h
F: kernel/idle.c
F: kernel/device.c
F: samples/boards/quark_se_c1000/power*/
QMSI
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: ext/hal/qmsi/
QMSI DRIVERS
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: drivers/*/*qmsi*
QUARK D2000 SOC
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: arch/x86/soc/intel_quark/quark_d2000/
QUARK SE C1000 SOC
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: arch/x86/soc/intel_quark/quark_se/
QUARK X1000 SOC
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: arch/x86/soc/intel_quark/quark_x1000/
SANITYCHECK
M: Andrew Boie <andrew.p.boie@intel.com>
S: Supported
F: scripts/sanitycheck
F: scripts/expr_parser.py
F: scripts/sanity_chk/
SENSOR DRIVERS
M: Bogdan Davidoaia <bogdan.davidoaia@gmail.com>
M: Murtaza Alexandru <murtaza.alexandru1995@gmail.com>
S: Maintained
W: https://www.zephyrproject.org/doc/subsystems/sensor.html
F: include/sensor.h
F: drivers/sensor/
F: samples/sensor/
STM32CUBE SDK
M: Erwan Gouriou <erwan.gouriou@linaro.org>
S: Supported
F: ext/hal/st/stm32cube/
STM32F4X SoC FAMILY and DRIVERS
M: Amit Kucheria <amit.kucheria@linaro.org>
M: Ricardo Salveti <ricardo.salveti@linaro.org>
S: Supported
F: arch/arm/soc/st_stm32/stm32f4/
F: drivers/pinmux/stm32/
F: drivers/gpio/*stm32*
F: drivers/clock_control/*stm32f4*
TINYCRYPT
M: Constanza Heath <constanza.m.heath@intel.com>
S: Supported
F: ext/lib/crypto/tinycrypt/
F: tests/crypto/
SPI
M: Tomasz Bursztyka <tomasz.bursztyka@linux.intel.com>
S: Supported
F: drivers/spi/
F: include/spi.h
F: tests/drivers/spi/
USB
M: Youvedeep Singh <youvedeep.singh@intel.com>
M: Andy Ross <andrew.j.ross@intel.com>
S: Supported
F: subsys/usb
F: drivers/usb
F: samples/subsys/usb
X86 ARCH
M: Andrew Boie <andrew.p.boie@intel.com>
M: Youvedeep Singh <youvedeep.singh@intel.com>
S: Supported
F: arch/x86/
F: include/arch/x86/
F: boards/x86/
XTENSA ARCH
M: Andrew Boie <andrew.p.boie@intel.com>
S: Supported
F: arch/xtensa
F: include/arch/xtensa/
F: boards/xtensa/
RISCV32 ARCH
M: Jean-Paul Etienne <fractalclone@gmail.com>
S: Supported
F: arch/riscv32
F: include/arch/riscv32
F: boards/riscv32
F: drivers/serial/uart_riscv_qemu.c
F: drivers/timer/pulpino_timer.c
F: drivers/timer/riscv_machine_timer.c
F: drivers/gpio/gpio_pulpino.c
ZOAP
M: Vinicius Costa Gomes <vinicius.gomes@intel.com>
S: Supported
F: subsys/net/lib/zoap/
F: samples/net/zoap_client/
F: samples/net/zoap_server/
F: tests/net/lib/zoap/
THE REST
M: Anas Nashif <anas.nashif@intel.com>
M: Kumar Gala <kumar.gala@linaro.org>
L: devel@lists.zephyrproject.org
T: git https://github.com/zephyrproject-rtos/zephyr
S: Buried alive in reporters
F: *
F: */

File diff suppressed because it is too large.

Makefile (new file, 1427 lines)

File diff suppressed because it is too large.

Makefile.inc (new file, 148 lines)

@@ -0,0 +1,148 @@
# vim: filetype=make
#
UNAME := $(shell uname)
ifeq (MINGW, $(findstring MINGW, $(UNAME)))
DQUOTE = '
# '
PROJECT_BASE ?= $(shell sh -c "pwd -W")
else
DQUOTE = "
# "
PROJECT_BASE ?= $(CURDIR)
endif
ifdef BOARD
KBUILD_DEFCONFIG_PATH=$(wildcard $(ZEPHYR_BASE)/boards/*/*/$(BOARD)_defconfig)
ifeq ($(KBUILD_DEFCONFIG_PATH),)
$(error Board $(BOARD) not found!)
endif
else
$(error BOARD is not defined!)
endif
# Choose a default output directory if one wasn't supplied. Note that
# PRISTINE_O depends on whether this is default or not. If building
# in-tree, we want to remove the whole outdir and not just the BOARD
# specified (thus "pristine"). Out of tree, we can obviously remove
# only what we were told to build.
ifndef O
PRISTINE_O = outdir
O = $(PROJECT_BASE)/outdir/$(BOARD)
else
PRISTINE_O = $(O)
endif
# Turn O into an absolute path; we call the main Kbuild with $(MAKE) -C
# which changes the working directory, relative paths don't work right.
# Need to create the directory first to make realpath happy
ifneq ($(MAKECMDGOALS),help)
$(shell mkdir -p $(O))
override O := $(realpath $(O))
endif
export ARCH QEMU_EXTRA_FLAGS PROJECT_BASE
override CONF_FILE := $(strip $(subst $(DQUOTE),,$(CONF_FILE)))
SOURCE_DIR ?= $(PROJECT_BASE)/src/
override SOURCE_DIR := $(realpath $(SOURCE_DIR))
override SOURCE_DIR := $(subst \,/,$(SOURCE_DIR))
override SOURCE_DIR_PARENT := $(patsubst %, %/.., $(SOURCE_DIR))
override SOURCE_DIR_PARENT := $(abspath $(SOURCE_DIR_PARENT))
override SOURCE_DIR_PARENT := $(subst \,/,$(SOURCE_DIR_PARENT))
export SOURCE_DIR SOURCE_DIR_PARENT
ifeq ("$(origin V)", "command line")
KBUILD_VERBOSE = $(V)
endif
ifndef KBUILD_VERBOSE
KBUILD_VERBOSE = 0
endif
ifeq ($(KBUILD_VERBOSE),1)
Q =
S =
else
Q = @
S = -s
endif
export CFLAGS
zephyrmake = +$(MAKE) -C $(ZEPHYR_BASE) O=$(1) \
PROJECT=$(PROJECT_BASE) SOURCE_DIR=$(DQUOTE)$(SOURCE_DIR)$(DQUOTE) $(2)
BOARDCONFIG = $(O)/.board_$(BOARD)
DOTCONFIG = $(O)/.config
all: $(DOTCONFIG)
$(Q)$(call zephyrmake,$(O),$@)
debug: $(DOTCONFIG)
$(Q)$(call zephyrmake,$(O),$@)
flash: $(DOTCONFIG)
$(Q)$(call zephyrmake,$(O),$@)
run: $(DOTCONFIG)
$(Q)$(call zephyrmake,$(O),$@)
ifeq ($(MAKECMDGOALS),debugserver)
ARCH = $(notdir $(subst /$(BOARD),,$(wildcard $(ZEPHYR_BASE)/boards/*/$(BOARD))))
BOARD_DIR = $(dir $(wildcard $(ZEPHYR_BASE)/boards/*/*/$(BOARD)_defconfig))
-include $(BOARD_DIR)/Makefile.board
-include $(ZEPHYR_BASE)/scripts/Makefile.toolchain.$(ZEPHYR_GCC_VARIANT)
BOARD_NAME = $(BOARD)
export BOARD_NAME
endif
debugserver: FORCE
$(Q)$(CONFIG_SHELL) $(ZEPHYR_BASE)/scripts/support/$(DEBUG_SCRIPT) debugserver
initconfig: $(DOTCONFIG)
$(BOARDCONFIG):
@rm -f $(O)/.board_*
@touch $@
ram_report: initconfig
$(Q)$(call zephyrmake,$(O),$@)
rom_report: initconfig
$(Q)$(call zephyrmake,$(O),$@)
outputexports: initconfig
$(Q)$(call zephyrmake,$(O),$@)
dts: initconfig
$(Q)$(call zephyrmake,$(O),$@)
config-sanitycheck: dts
$(Q)$(call zephyrmake,$(O),$@)
menuconfig: initconfig
$(Q)$(call zephyrmake,$(O),$@)
help:
$(Q)$(MAKE) -s -C $(ZEPHYR_BASE) $@
# Catch all
%:
$(Q)$(call zephyrmake,$(O),$@)
$(DOTCONFIG): $(BOARDCONFIG) $(KBUILD_DEFCONFIG_PATH) $(CONF_FILE)
$(Q)$(CONFIG_SHELL) $(ZEPHYR_BASE)/scripts/kconfig/merge_config.sh \
-q -m -O $(O) $(KBUILD_DEFCONFIG_PATH) $(OVERLAY_CONFIG) $(CONF_FILE) \
$(wildcard $(O)/*.conf)
$(Q)$(MAKE) $(S) -C $(ZEPHYR_BASE) O=$(O) PROJECT=$(PROJECT_BASE) oldnoconfig
pristine:
$(Q)rm -rf $(PRISTINE_O)
PHONY += FORCE initconfig
FORCE:
.PHONY: $(PHONY)

Makefile.test (new file, 4 lines)

@@ -0,0 +1,4 @@
OVERLAY_CONFIG += $(ZEPHYR_BASE)/tests/include/test.config
include ${ZEPHYR_BASE}/Makefile.inc


@@ -1,19 +1,10 @@
Zephyr Project
##############
.. raw:: html
<a href="https://www.zephyrproject.org">
<p align="center">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="doc/_static/images/logo-readme-dark.svg">
<source media="(prefers-color-scheme: light)" srcset="doc/_static/images/logo-readme-light.svg">
<img src="doc/_static/images/logo-readme-light.svg">
</picture>
</p>
</a>
<a href="https://bestpractices.coreinfrastructure.org/projects/74"><img src="https://bestpractices.coreinfrastructure.org/projects/74/badge"></a>
<a href="https://scorecard.dev/viewer/?uri=github.com/zephyrproject-rtos/zephyr"><img src="https://api.securityscorecards.dev/projects/github.com/zephyrproject-rtos/zephyr/badge"></a>
<a href="https://github.com/zephyrproject-rtos/zephyr/actions/workflows/twister.yaml?query=branch%3Amain"><img src="https://github.com/zephyrproject-rtos/zephyr/actions/workflows/twister.yaml/badge.svg?event=push"></a>
<a href="https://bestpractices.coreinfrastructure.org/projects/74"><img
src="https://bestpractices.coreinfrastructure.org/projects/74/badge"></a>
The Zephyr Project is a scalable real-time operating system (RTOS) supporting
multiple hardware architectures, optimized for resource constrained devices,
@@ -23,87 +14,88 @@ The Zephyr OS is based on a small-footprint kernel designed for use on
resource-constrained systems: from simple embedded environmental sensors and
LED wearables to sophisticated smart watches and IoT wireless gateways.
The Zephyr kernel supports multiple architectures, including ARM (Cortex-A,
Cortex-R, Cortex-M), Intel x86, ARC, Nios II, Tensilica Xtensa, RISC-V,
SPARC, MIPS, and a large number of `supported boards`_.
The Zephyr kernel supports multiple architectures, including ARM Cortex-M,
Intel x86, ARC, NIOS II, Tensilica Xtensa, and RISC V, and a large number of
`supported boards`_.
.. below included in doc/introduction/introduction.rst
Getting Started
***************
Welcome to Zephyr! See the `Introduction to Zephyr`_ for a high-level overview,
and the documentation's `Getting Started Guide`_ to start developing.
.. start_include_here
Community Support
*****************
Community support is provided via mailing lists and Discord; see the Resources
below for details.
The Zephyr Project Developer Community includes developers from member
organizations and the general community all joining in the development of
software within the Zephyr Project. Members contribute and discuss ideas,
submit bugs and bug fixes, and provide training. They also help those in need
through the community's forums such as mailing lists and IRC channels. Anyone
can join the developer community and the community is always willing to help
its members and the User Community to get the most out of the Zephyr Project.
.. _project-resources:
Welcome to the Zephyr community!
Resources
*********
Here's a quick summary of resources to help you find your way around:
Here's a quick summary of resources to find your way around the Zephyr Project
support systems:
Getting Started
---------------
* **Zephyr Project Website**: The https://zephyrproject.org website is the
central source of information about the Zephyr Project. On this site, you'll
find background and current information about the project as well as all the
relevant links to project material. For a quick start, refer to the
`Zephyr Introduction`_ and `Getting Started Guide`_.
| 📖 `Zephyr Documentation`_
| 🚀 `Getting Started Guide`_
| 🙋🏽 `Tips when asking for help`_
| 💻 `Code samples`_
* **Releases**: Source code for Zephyr kernel releases are available at
https://zephyrproject.org/downloads. On this page,
you'll find release information, and links to download or clone source
code from our GitHub repository. You'll also find links for the Zephyr
SDK, a moderated collection of tools and libraries used to develop your
applications.
Code and Development
--------------------
* **Source Code in GitHub**: Zephyr Project source code is maintained on a
public GitHub repository at https://github.com/zephyrproject-rtos/zephyr.
You'll find information about getting access to the repository and how to
contribute to the project in this `Contribution Guide`_ document.
| 🌐 `Source Code Repository`_
| 📦 `Releases`_
| 🤝 `Contribution Guide`_
* **Samples Code**: In addition to the kernel source code, there are also
many documented `Sample and Demo Code Examples`_ that can help show you
how to use Zephyr services and subsystems.
Community and Support
---------------------
* **Documentation**: Extensive Project technical documentation is developed
along with the Zephyr kernel itself, and can be found at
https://zephyrproject.org/doc. Additional documentation is maintained in
the `Zephyr GitHub wiki`_.
| 💬 `Discord Server`_ for real-time community discussions
| 📧 `User mailing list (users@lists.zephyrproject.org)`_
| 📧 `Developer mailing list (devel@lists.zephyrproject.org)`_
| 📬 `Other project mailing lists`_
| 📚 `Project Wiki`_
* **Issue Reporting and Tracking**: Requirements and Issue tracking is done in
our JIRA system: https://jira.zephyrproject.org. You can browse through the
reported issues and submit issues of your own.
Issue Tracking and Security
---------------------------
* **Security-related Issue Reporting**: For security-related inquiries or
reporting suspected security-related bugs in the Zephyr OS, please
send email to vulnerabilities@zephyrproject.org. We will assess and fix
flaws according to our security policy outlined in the Zephyr Project
`Security Overview`_.
| 🐛 `GitHub Issues`_
| 🔒 `Security documentation`_
| 🛡️ `Security Advisories Repository`_
| ⚠️ Report security vulnerabilities at vulnerabilities@zephyrproject.org
* **Mailing List**: The `Zephyr Mailing Lists`_ are perhaps the most convenient
way to track developer discussions and to ask your own support questions to
the Zephyr project community.
You can also read through message archives to follow
past posts and discussions, a good thing to do to discover more about the
Zephyr project.
Additional Resources
--------------------
| 🌐 `Zephyr Project Website`_
| 📺 `Zephyr Tech Talks`_
* **IRC Chatting**: You can chat online with the Zephyr project developer
community and other users in our IRC channel #zephyrproject on the
freenode.net IRC server. You can use the http://webchat.freenode.net web
client or use a client-side application such as pidgin.
.. _Zephyr Project Website: https://www.zephyrproject.org
.. _Discord Server: https://chat.zephyrproject.org
.. _supported boards: https://docs.zephyrproject.org/latest/boards/index.html
.. _Zephyr Documentation: https://docs.zephyrproject.org
.. _Introduction to Zephyr: https://docs.zephyrproject.org/latest/introduction/index.html
.. _Getting Started Guide: https://docs.zephyrproject.org/latest/develop/getting_started/index.html
.. _Contribution Guide: https://docs.zephyrproject.org/latest/contribute/index.html
.. _Source Code Repository: https://github.com/zephyrproject-rtos/zephyr
.. _GitHub Issues: https://github.com/zephyrproject-rtos/zephyr/issues
.. _Releases: https://github.com/zephyrproject-rtos/zephyr/releases
.. _Project Wiki: https://github.com/zephyrproject-rtos/zephyr/wiki
.. _User mailing list (users@lists.zephyrproject.org): https://lists.zephyrproject.org/g/users
.. _Developer mailing list (devel@lists.zephyrproject.org): https://lists.zephyrproject.org/g/devel
.. _Other project mailing lists: https://lists.zephyrproject.org/g/main/subgroups
.. _Code samples: https://docs.zephyrproject.org/latest/samples/index.html
.. _Security documentation: https://docs.zephyrproject.org/latest/security/index.html
.. _Security Advisories Repository: https://github.com/zephyrproject-rtos/zephyr/security
.. _Tips when asking for help: https://docs.zephyrproject.org/latest/develop/getting_started/index.html#asking-for-help
.. _Zephyr Tech Talks: https://www.zephyrproject.org/tech-talks
.. _supported boards: https://www.zephyrproject.org/doc/boards/boards.html
.. _Zephyr Introduction: https://www.zephyrproject.org/doc/introduction/introducing_zephyr.html
.. _Getting Started Guide: https://www.zephyrproject.org/doc/getting_started/getting_started.html
.. _Contribution Guide: https://www.zephyrproject.org/doc/contribute/contribute_guidelines.html
.. _Zephyr GitHub wiki: https://github.com/zephyrproject-rtos/zephyr/wiki
.. _Zephyr Mailing Lists: https://lists.zephyrproject.org/
.. _Sample and Demo Code Examples: https://www.zephyrproject.org/doc/samples/samples.html
.. _Security Overview: https://www.zephyrproject.org/doc/security/security-overview.html


@@ -1 +0,0 @@
0.17.0


@@ -1,5 +0,0 @@
VERSION_MAJOR = 4
VERSION_MINOR = 1
PATCHLEVEL = 0
VERSION_TWEAK = 0
EXTRAVERSION =


@@ -1,14 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
# FIXME: SHADOW_VARS: Remove this once we have enabled -Wshadow globally.
add_compile_options($<TARGET_PROPERTY:compiler,warning_shadow_variables>)
add_definitions(-D__ZEPHYR_SUPERVISOR__)
include_directories(
${ZEPHYR_BASE}/kernel/include
${ARCH_DIR}/${ARCH}/include
)
add_subdirectory(common)
add_subdirectory(${ARCH_DIR}/${ARCH} arch/${ARCH})

File diff suppressed because it is too large.


@@ -1,5 +0,0 @@
# Copyright (c) 2023 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
# Note: $ARCH might be a glob pattern
source "$(ARCH_DIR)/$(ARCH)/Kconfig"


@@ -1,5 +0,0 @@
# Copyright (c) 2023 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
source "$(KCONFIG_BINARY_DIR)/arch/Kconfig"

arch/Makefile (new file, 1 line)

@@ -0,0 +1 @@
obj-y += common/ $(ARCH)/


@@ -1,47 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
# Enable debug support in mdb
# Dwarf version 2 can be recognized by mdb
# The default dwarf version in gdb is not recognized by mdb
zephyr_cc_option(-g3 -gdwarf-2)
# Without this (poorly named) option, compiler may generate undefined
# references to abort().
# See https://gcc.gnu.org/bugzilla/show_bug.cgi?id=63691
zephyr_cc_option(-fno-delete-null-pointer-checks)
zephyr_cc_option_ifdef(CONFIG_ARC_USE_UNALIGNED_MEM_ACCESS -munaligned-access)
if(NOT COMPILER STREQUAL arcmwdt)
if(CONFIG_THREAD_LOCAL_STORAGE)
# Instruct compiler to use proper register as cached thread pointer for thread local storage.
# For ARCv2 the default register is usually not specified - so we need to specify it
# For ARCv3 the register is fixed to r30, so we don't need to specify it
zephyr_compile_options_ifdef(CONFIG_ISA_ARCV2 -mtp-regno=26)
else()
# If thread local storage isn't used - we can safely schedule thread pointer register
zephyr_compile_options_ifdef(CONFIG_ISA_ARCV2 -mtp-regno=none)
endif()
endif()
add_subdirectory(core)
if(COMPILER STREQUAL arcmwdt)
add_subdirectory(arcmwdt)
if(CONFIG_64BIT)
zephyr_compile_options(-Ml)
# Instruct MWDT assembler not to warn when we load only lower half (32bit) of symbol
# instead of full 64bit address in ASM code. It is valid as we don't support Zephyr
# linkage to high addresses for 64bit ARC platforms.
zephyr_compile_options(-Wa,-offwarn=168)
endif()
endif()
if(CONFIG_ISA_ARCV3 AND CONFIG_64BIT)
set_property(GLOBAL PROPERTY PROPERTY_OUTPUT_FORMAT elf64-littlearc64)
elseif(CONFIG_ISA_ARCV3 AND NOT CONFIG_64BIT)
set_property(GLOBAL PROPERTY PROPERTY_OUTPUT_FORMAT elf32-littlearc64)
else()
set_property(GLOBAL PROPERTY PROPERTY_OUTPUT_FORMAT elf32-littlearc)
endif()

arch/arc/Kbuild (new file, 6 lines)

@@ -0,0 +1,6 @@
subdir-ccflags-y +=-I$(srctree)/include/drivers
subdir-ccflags-y +=-I$(srctree)/drivers
subdir-asflags-y += $(subdir-ccflags-y)
obj-y += soc/$(SOC_PATH)/
obj-y += core/


@@ -1,7 +1,18 @@
# ARC options
# ARC EM4 options
# Copyright (c) 2014, 2019 Wind River Systems, Inc.
#
# Copyright (c) 2014 Wind River Systems, Inc.
#
# SPDX-License-Identifier: Apache-2.0
#
choice
prompt "ARC SoC Selection"
depends on ARC
source "arch/arc/soc/*/Kconfig.soc"
endchoice
menu "ARC Options"
depends on ARC
@@ -9,136 +20,56 @@ menu "ARC Options"
config ARCH
default "arc"
config CPU_ARCEM
config ARCH_DEFCONFIG
string
default "arch/arc/defconfig"
config CPU_HAS_MPU
bool
# Omit prompt to signify "hidden" option
default n
help
This option is enabled when the CPU has a Memory Protection Unit (MPU).
config CPU_HAS_FPU
# Hidden config selected by CPU family
bool
default n
help
This option is enabled when the CPU has hardware floating point
unit.
menu "ARC EM4 processor options"
config CPU_ARCEM4
bool
default y
select CPU_ARCV2
select ATOMIC_OPERATIONS_C
help
This option signifies the use of an ARC EM CPU
This option signifies the use of an ARC EM4 CPU
config CPU_ARCHS
endmenu
menu "ARCv2 Family Options"
config CPU_ARCV2
bool
select ATOMIC_OPERATIONS_BUILTIN
select BARRIER_OPERATIONS_BUILTIN
help
This option signifies the use of an ARC HS CPU
choice
prompt "ARC Instruction Set"
default ISA_ARCV2
config ISA_ARCV2
bool "ARC ISA v2"
select ARCH_HAS_STACK_PROTECTION if ARC_HAS_STACK_CHECKING || (ARC_MPU && ARC_MPU_VER !=2)
select ARCH_HAS_USERSPACE if ARC_MPU
select ARCH_HAS_SINGLE_THREAD_SUPPORT if !SMP
select USE_SWITCH
select USE_SWITCH_SUPPORTED
help
v2 ISA for the ARC-HS & ARC-EM cores
config ISA_ARCV3
bool "ARC ISA v3"
select ARCH_HAS_SINGLE_THREAD_SUPPORT if !SMP
select USE_SWITCH
select USE_SWITCH_SUPPORTED
endchoice
if ISA_ARCV2
config CPU_EM4
bool
select CPU_ARCEM
help
If y, the SoC uses an ARC EM4 CPU
config CPU_EM4_DMIPS
bool
select CPU_ARCEM
help
If y, the SoC uses an ARC EM4 DMIPS CPU
config CPU_EM4_FPUS
bool
select CPU_ARCEM
help
If y, the SoC uses an ARC EM4 DMIPS CPU with the single-precision
floating-point extension
config CPU_EM4_FPUDA
bool
select CPU_ARCEM
help
If y, the SoC uses an ARC EM4 DMIPS CPU with single-precision
floating-point and double assist instructions
config CPU_EM6
bool
select CPU_ARCEM
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an ARC EM6 CPU
config CPU_HS3X
bool
select CPU_ARCHS
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an ARC HS3x CPU
config CPU_HS4X
bool
select CPU_ARCHS
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an HS4X CPU
endif #ISA_ARCV2
if ISA_ARCV3
config CPU_HS5X
bool
select CPU_ARCHS
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an ARC HS6x CPU
config CPU_HS6X
bool
select CPU_ARCHS
select 64BIT
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an ARC HS6x CPU
endif #ISA_ARCV3
config FP_FPU_DA
bool
menu "ARC CPU Options"
config ARC_HAS_ZOL
bool
depends on ISA_ARCV2
default y
help
ARCv2 CPUs have ZOL hardware loop mechanism which the ARCv3 ISA drops.
Architecturally ZOL provides
- LPcc instruction
- LP_COUNT core reg
- LP_START, LP_END aux regs
Disabling this option removes usage of ZOL regs from code
This option signifies the use of a CPU of the ARCv2 family.
config NUM_IRQ_PRIO_LEVELS
int "Number of supported interrupt priority levels"
config DATA_ENDIANNESS_LITTLE
bool
default y
help
This is driven by the processor implementation, since it is fixed in
hardware. The BSP should set this value to 'n' if the data is
implemented as big endian.
config NUM_IRQ_PRIO_LEVELS
int
prompt "Number of supported interrupt priority levels"
range 1 16
help
Interrupt priorities available will be 0 to NUM_IRQ_PRIO_LEVELS-1.
@@ -146,8 +77,9 @@ config NUM_IRQ_PRIO_LEVELS
The BSP must provide a valid default for proper operation.
config NUM_IRQS
int "Upper limit of interrupt numbers/IDs used"
config NUM_IRQS
int
prompt "Upper limit of interrupt numbers/IDs used"
range 17 256
help
Interrupts available will be 0 to NUM_IRQS-1.
@@ -157,113 +89,56 @@ config NUM_IRQS
The BSP must provide a valid default. This drives the size of the
vector table.
config RGF_NUM_BANKS
int "Number of General Purpose Register Banks"
depends on ARC_FIRQ
depends on NUM_IRQ_PRIO_LEVELS > 1
config RGF_NUM_BANKS
int
prompt "Number of General Purpose Register Banks"
depends on CPU_ARCV2
range 1 2
default 2
help
The ARC CPU can be configured to have more than one register
bank. If fast interrupts are supported (FIRQ), the 2nd
register bank, in the set, will be used by FIRQ interrupts.
If fast interrupts are supported but there is only 1
register bank, the fast interrupt handler must save
and restore general purpose registers.
NOTE: more than one interrupt priority level is required to use the
second register bank - otherwise all interrupts will use the same
register bank. Such a configuration isn't supported in software and is
not beneficial from a performance point of view.
config ARC_FIRQ
bool "FIRQ enable"
depends on ISA_ARCV2
depends on NUM_IRQ_PRIO_LEVELS > 1
depends on !ARC_HAS_SECURE
default y
help
Fast interrupts are supported (FIRQ). If FIRQ is enabled then, for the
interrupts with the highest priority, status32 and pc are saved in aux
regs while the other registers are saved according to the number of
register banks; if FIRQ is disabled, the highest-priority interrupts
are handled the same way as all other interrupts.
NOTE: a configuration with FIRQ enabled and only one interrupt
priority level (so that all interrupts are FIRQ) is not allowed. Such a
configuration isn't supported in software and is not beneficial from a
performance point of view.
config ARC_FIRQ_STACK
bool "Separate firq stack"
depends on ARC_FIRQ && RGF_NUM_BANKS > 1
help
Use a separate stack for FIRQ handling. When the fast irq is also a
direct irq, this gives the minimal interrupt latency.
config ARC_FIRQ_STACK_SIZE
int "FIRQ stack size"
depends on ARC_FIRQ_STACK
default 1024
help
The size of the FIRQ stack.
config ARC_HAS_STACK_CHECKING
bool "ARC has STACK_CHECKING"
depends on ISA_ARCV2
default y
help
ARC is configured with STACK_CHECKING which is a mechanism for
checking stack accesses and raising an exception when a stack
overflow or underflow is detected.
config ARC_CONNECT
bool "ARC has ARC connect"
select SCHED_IPI_SUPPORTED
help
ARC is configured with ARC Connect, the hardware block used to connect
multiple cores.
config ARC_STACK_CHECKING
bool
select NO_UNUSED_STACK_INSPECTION
help
Use ARC STACK_CHECKING to do stack protection
config ARC_STACK_PROTECTION
bool
default y if HW_STACK_PROTECTION
select ARC_STACK_CHECKING if ARC_HAS_STACK_CHECKING
select MPU_STACK_GUARD if (!ARC_STACK_CHECKING && ARC_MPU && ARC_MPU_VER !=2)
config ARC_STACK_CHECKING
bool "Enable Stack Checking"
depends on CPU_ARCV2
select THREAD_STACK_INFO
default n
help
This option enables either:
- The ARC stack checking, or
- the MPU-based stack guard
to cause a system fatal error
if the bounds of the current process stack are overflowed.
The two stack guard options are mutually exclusive. The
selection of the ARC stack checking is
prioritized over the MPU-based stack guard.
ARCv2 has a special feature that allows checking for stack overflows.
This option enables code that makes use of this debug feature.
config ARC_USE_UNALIGNED_MEM_ACCESS
bool "Unaligned access in HW"
default y if CPU_ARCHS
depends on (CPU_ARCEM && !ARC_HAS_SECURE) || CPU_ARCHS
help
ARC EM cores w/o secure shield 2+2 mode support might be configured
to support unaligned memory access which is then disabled by default.
Enable unaligned access in hardware and allow software to use it.
config FAULT_DUMP
int
prompt "Fault dump level"
default 2
range 0 2
help
Different levels for display information when a fault occurs.
2: The default. Display specific and verbose information. Consumes
the most memory (long strings).
1: Display general and short information. Consumes less memory
(short strings).
0: Off.
config ARC_CURRENT_THREAD_USE_NO_TLS
bool
select CURRENT_THREAD_USE_NO_TLS
default y if (RGF_NUM_BANKS > 1) || ("$(ZEPHYR_TOOLCHAIN_VARIANT)" = "arcmwdt")
help
Disable current Thread Local Storage for ARC. For cores with more than
one register bank (RGF_NUM_BANKS > 1) TLS-based access to the current
thread is disabled by default because bank synchronization requires
significant time and slows down performance. ARCMWDT works with the
TLS pointer in a different way than GCC; optimized access to the TLS
pointer via the _current symbol does not provide significant advantages
in the case of MetaWare.
config IRQ_OFFLOAD
bool "Enable IRQ offload"
default n
help
Enable irq_offload() API which allows functions to be synchronously
run in interrupt context. Uses one entry in the IDT. Mainly useful
for test cases.
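A minimal sketch of the irq_offload() usage the help text refers to, assuming the current <zephyr/...> header layout; the function and variable names are illustrative.
/*
 * Sketch: irq_offload() runs the given routine synchronously in
 * interrupt context, which is mainly useful in test code.
 */
#include <zephyr/kernel.h>
#include <zephyr/irq_offload.h>
#include <zephyr/sys/printk.h>

static volatile int ran_in_isr;

static void offloaded_routine(const void *param)
{
	ARG_UNUSED(param);
	ran_in_isr = k_is_in_isr(); /* expected to be true here */
}

void irq_offload_demo(void)
{
	irq_offload(offloaded_routine, NULL);
	/* irq_offload() only returns after offloaded_routine() has run */
	printk("ran in ISR context: %d\n", ran_in_isr);
}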
config XIP
default n if NSIM
default y
config GEN_ISR_TABLES
default y
@@ -272,94 +147,53 @@ config GEN_IRQ_START_VECTOR
default 16
config HARVARD
bool "Harvard Architecture"
prompt "Harvard Architecture"
bool
default n
help
The ARC CPU can be configured to have two busses;
one for instruction fetching and another that serves as a data bus.
config CODE_DENSITY
bool "Code Density Option"
prompt "Code Density Option"
bool
default n
help
Enable code density option to get better code density
config ARC_HAS_ACCL_REGS
bool "Reg Pair ACCL:ACCH (FPU and/or MPY > 6 and/or DSP)"
default y if CPU_HS3X || CPU_HS4X || CPU_HS5X || CPU_HS6X
help
Depending on the configuration, the CPU can contain an accumulator reg-pair
(also referred to as r58:r59). These can also be used by gcc as GPRs, so the
kernel needs to save/restore them per process.
config ARC_HAS_SECURE
bool "ARC has SecureShield"
depends on ISA_ARCV2
select CPU_HAS_TEE
select ARCH_HAS_TRUSTED_EXECUTION
help
This option is enabled when the ARC core supports secure mode
menu "Floating Point Options"
depends on CPU_HAS_FPU
config FLOAT
bool
prompt "Floating point registers"
default n
help
This option allows tasks and fibers to use the floating point registers.
By default, only a single task or fiber may use the registers.
Disabling this option means that any task or fiber that uses a
floating point register will get a fatal exception.
config FP_SHARING
bool
prompt "Floating point register sharing"
depends on FLOAT
default n
help
This option allows multiple tasks and fibers to use the floating point
registers.
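The "tasks and fibers" wording above is 1.x terminology; per-thread FP sharing is expressed the same way in the unified kernel. A hedged sketch follows, assuming the K_FP_REGS thread option that is provided when FP sharing is enabled; the names and sizes are illustrative.
/*
 * Sketch: a thread that uses the FP registers is created with the
 * K_FP_REGS option (available when FP sharing is enabled) so the
 * kernel saves/restores its floating point context.
 */
#include <zephyr/kernel.h>

#define FP_THREAD_STACK_SIZE 1024
#define FP_THREAD_PRIORITY   5

K_THREAD_STACK_DEFINE(fp_thread_stack, FP_THREAD_STACK_SIZE);
static struct k_thread fp_thread_data;

static void fp_thread_entry(void *a, void *b, void *c)
{
	ARG_UNUSED(a);
	ARG_UNUSED(b);
	ARG_UNUSED(c);

	volatile float acc = 0.0f;

	for (int i = 0; i < 100; i++) {
		acc += 0.5f; /* FP register state survives preemption */
	}
}

void start_fp_thread(void)
{
	k_thread_create(&fp_thread_data, fp_thread_stack,
			K_THREAD_STACK_SIZEOF(fp_thread_stack),
			fp_thread_entry, NULL, NULL, NULL,
			FP_THREAD_PRIORITY, K_FP_REGS, K_NO_WAIT);
}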
config SJLI_TABLE_SIZE
int "SJLI table size"
depends on ARC_SECURE_FIRMWARE
default 8
help
The size of the sjli (Secure Jump and Link Indexed) table. Code running
in normal mode calls secure services in secure mode through the sjli
instruction.
config ARC_SECURE_FIRMWARE
bool "Generate Secure Firmware"
depends on ARC_HAS_SECURE
default y if TRUSTED_EXECUTION_SECURE
help
This option indicates that we are building a Zephyr image that
is intended to execute in secure mode. The option is only
applicable to ARC processors that implement the SecureShield.
This option enables Zephyr to include code that executes in
secure mode, as well as to exclude code that is designed to
execute only in normal mode.
Code executing in secure mode has access to both the secure
and normal resources of the ARC processors.
config ARC_NORMAL_FIRMWARE
bool "Generate Normal Firmware"
depends on !ARC_SECURE_FIRMWARE
depends on ARC_HAS_SECURE
default y if TRUSTED_EXECUTION_NONSECURE
help
This option indicates that we are building a Zephyr image that
is intended to execute in normal mode. Execution of this
image is triggered by secure firmware that executes in secure
mode. The option is only applicable to ARC processors that
implement the SecureShield.
This option enables Zephyr to include code that executes in
normal mode only, as well as to exclude code that is
designed to execute only in secure mode.
Code executing in normal mode has no access to secure
resources of the ARC processors, and, therefore, it shall avoid
accessing them.
config ARC_VPX_COOPERATIVE_SHARING
bool "Cooperative sharing of ARC VPX vector registers"
select SCHED_CPU_MASK if MP_MAX_NUM_CPUS > 1
help
This option enables the cooperative sharing of the ARC VPX vector
registers. Threads that want to use those registers must successfully
call arc_vpx_lock() before using them, and call arc_vpx_unlock()
when done using them.
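A short sketch of the lock/unlock protocol named above; the header path and the exact arc_vpx_lock() prototype (assumed here to take a k_timeout_t) are assumptions, not taken from this diff.
/*
 * Sketch: a thread must hold the VPX lock while it touches the ARC VPX
 * vector registers, and release it when done.
 */
#include <zephyr/kernel.h>
#include <zephyr/arch/arc/v2/vpx/arc_vpx.h> /* assumed header location */

void do_vector_work(void)
{
	if (arc_vpx_lock(K_MSEC(10)) != 0) {
		return; /* another thread currently owns the VPX registers */
	}

	/* ... use the VPX vector registers here ... */

	arc_vpx_unlock(); /* let other threads use VPX again */
}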
source "arch/arc/core/dsp/Kconfig"
endmenu
menu "ARC MPU Options"
depends on CPU_HAS_MPU
config ARC_MPU_ENABLE
bool "Memory Protection Unit (MPU)"
bool "Enable MPU"
depends on CPU_HAS_MPU
select ARC_MPU
default n
help
Enable MPU
@@ -367,53 +201,99 @@ source "arch/arc/core/mpu/Kconfig"
endmenu
config DCACHE_LINE_SIZE
config ICCM_SIZE
int "ICCM Size in kB"
help
This option specifies the size of the ICCM in kB. It is normally set by
the board's defconfig file and the user should generally avoid modifying
it via the menu configuration.
config ICCM_BASE_ADDRESS
hex "ICCM Base Address"
help
This option specifies the base address of the ICCM on the board. It is
normally set by the board's defconfig file and the user should generally
avoid modifying it via the menu configuration.
config DCCM_SIZE
int "DCCM Size in kB"
help
This option specifies the size of the DCCM in kB. It is normally set by
the board's defconfig file and the user should generally avoid modifying
it via the menu configuration.
config DCCM_BASE_ADDRESS
hex "DCCM Base Address"
help
This option specifies the base address of the DCCM on the board. It is
normally set by the board's defconfig file and the user should generally
avoid modifying it via the menu configuration.
config SRAM_SIZE
int "SRAM Size in kB"
help
This option specifies the size of the SRAM in kB. It is normally set by
the board's defconfig file and the user should generally avoid modifying
it via the menu configuration.
config SRAM_BASE_ADDRESS
hex "SRAM Base Address"
help
This option specifies the base address of the SRAM on the board. It is
normally set by the board's defconfig file and the user should generally
avoid modifying it via the menu configuration.
config FLASH_SIZE
int "Flash Size in kB"
help
This option specifies the size of the flash in kB. It is normally set by
the board's defconfig file and the user should generally avoid modifying
it via the menu configuration.
config FLASH_BASE_ADDRESS
hex "Flash Base Address"
help
This option specifies the base address of the flash on the board. It is
normally set by the board's defconfig file and the user should generally
avoid modifying it via the menu configuration.
config CACHE_LINE_SIZE_DETECT
bool
prompt "Detect d-cache line size at runtime"
default n
help
This option enables querying the d-cache build register for finding
the d-cache line size at the expense of taking more memory and code
and a slightly increased boot time.
If the CPU's d-cache line size is known in advance, disable this
option and manually enter the value for CACHE_LINE_SIZE.
config CACHE_LINE_SIZE
int
prompt "Cache line size" if !CACHE_LINE_SIZE_DETECT
default 32
help
Size in bytes of a CPU d-cache line.
Detect automatically at runtime by selecting CACHE_LINE_SIZE_DETECT.
config ARC_EXCEPTION_STACK_SIZE
int "ARC exception handling stack size"
default 768 if !64BIT
default 2048 if 64BIT
help
Size in bytes of the exception handling stack, which sits at the top of
the interrupt stack to keep the memory footprint small since exceptions
are infrequent. To limit the impact on interrupt handling, especially
nested interrupts, it should not be too large.
config ARCH_CACHE_FLUSH_DETECT
bool
default n
config CACHE_FLUSHING
bool
default n
prompt "Enable d-cache flushing mechanism"
help
This links in the sys_cache_flush() function, which provides a
way to flush multiple lines of the d-cache.
If the d-cache is present, set this to y.
If the d-cache is NOT present, set this to n.
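A sketch of the helper this option links in. The prototype has changed across releases; the 1.x-style (address, size) form and include path are assumed here, and newer trees expose sys_cache_data_flush_range() instead.
/*
 * Sketch only: flush the d-cache lines backing a buffer so that a bus
 * master (e.g. DMA) sees the data. Prototype and include path assumed.
 */
#include <cache.h>

static unsigned char shared_buf[64];

void flush_shared_buf(void)
{
	sys_cache_flush((vaddr_t)shared_buf, sizeof(shared_buf));
}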
endmenu
config ARC_EARLY_SOC_INIT
bool "Make early stage SoC-specific initialization"
help
Call SoC per-core setup code during early-stage initialization (before
C runtime initialization). The setup code is invoked in the form of the
soc_early_asm_init_percpu assembler macro.
config MAIN_STACK_SIZE
default 4096 if 64BIT
config ISR_STACK_SIZE
default 4096 if 64BIT
config SYSTEM_WORKQUEUE_STACK_SIZE
default 4096 if 64BIT
config IDLE_STACK_SIZE
default 1024 if 64BIT
config IPM_CONSOLE_STACK_SIZE
default 2048 if 64BIT
config TEST_EXTRA_STACK_SIZE
default 2048 if 64BIT
config CMSIS_THREAD_MAX_STACK_SIZE
default 2048 if 64BIT
config CMSIS_V2_THREAD_MAX_STACK_SIZE
default 2048 if 64BIT
config CMSIS_V2_THREAD_DYNAMIC_STACK_SIZE
default 2048 if 64BIT
source "arch/arc/soc/*/Kconfig"
endmenu

arch/arc/Makefile

@@ -0,0 +1,24 @@
# Enable debug support in mdb
# Dwarf version 2 can be recognized by mdb
# The default dwarf version produced by the compiler is not recognized by mdb
cflags-y += $(call cc-option, -g3 -gdwarf-2)
cflags-y += $(call cc-option,-ffunction-sections,) $(call cc-option,-fdata-sections,)
# Without this (poorly named) option, compiler may generate undefined
# references to abort().
# See https://gcc.gnu.org/bugzilla/show_bug.cgi?id=63691
cflags-y += $(call cc-option,-fno-delete-null-pointer-checks)
cflags-$(CONFIG_LTO) = $(call cc-option,-flto,)
include $(srctree)/arch/$(ARCH)/soc/$(SOC_PATH)/Makefile
KBUILD_CFLAGS += $(cflags-y)
KBUILD_CXXFLAGS += $(cflags-y)
soc-cxxflags ?= $(soc-cflags)
soc-aflags ?= $(soc-cflags)
KBUILD_CFLAGS += $(soc-cflags)
KBUILD_CXXFLAGS += $(soc-cxxflags)
KBUILD_AFLAGS += $(soc-aflags)


@@ -1,5 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
if(CONFIG_ARCMWDT_LIBC OR CONFIG_CPP)
zephyr_sources(arcmwdt-dtr-stubs.c)
endif()


@@ -1,22 +0,0 @@
/*
* Copyright (c) 2021 Synopsys.
*
* SPDX-License-Identifier: Apache-2.0
*/
#include <zephyr/toolchain.h>
__weak void *__dso_handle;
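/*
 * Link-time stubs: a Zephyr image does not normally exit, so C++
 * destructor registration can be ignored; both functions below simply
 * discard their arguments and return 0.
 */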
int __cxa_atexit(void (*destructor)(void *), void *objptr, void *dso)
{
ARG_UNUSED(destructor);
ARG_UNUSED(objptr);
ARG_UNUSED(dso);
return 0;
}
int atexit(void (*function)(void))
{
return 0;
}


@@ -1,38 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
zephyr_library()
zephyr_library_sources(
thread.c
thread_entry_wrapper.S
cpu_idle.S
fatal.c
fault.c
fault_s.S
irq_manage.c
timestamp.c
isr_wrapper.S
regular_irq.S
switch.S
prep_c.c
reset.S
vector_table.c
)
zephyr_library_sources_ifdef(CONFIG_ARCH_CACHE cache.c)
zephyr_library_sources_ifdef(CONFIG_ARC_FIRQ fast_irq.S)
zephyr_library_sources_ifdef(CONFIG_IRQ_OFFLOAD irq_offload.c)
zephyr_library_sources_ifdef(CONFIG_USERSPACE userspace.S)
zephyr_library_sources_ifdef(CONFIG_ARC_CONNECT arc_connect.c)
zephyr_library_sources_ifdef(CONFIG_ARC_CONNECT smp.c)
zephyr_library_sources_ifdef(CONFIG_THREAD_LOCAL_STORAGE tls.c)
add_subdirectory_ifdef(CONFIG_ARC_CORE_MPU mpu)
add_subdirectory_ifdef(CONFIG_ARC_SECURE_FIRMWARE secureshield)
zephyr_linker_sources(ROM_START SORT_KEY 0x0vectors vector_table.ld)
zephyr_library_sources_ifdef(CONFIG_LLEXT elf.c)
