Compare commits


107 Commits

Author SHA1 Message Date
Anas Nashif
8f0b4d7f4d Zephyr 1.6-rc2
Change-Id: I930d11d2e41af3b77513531b8944ba77b5c5b278
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2016-11-19 13:20:39 -05:00
Benjamin Walsh
796a6bb4d8 samples: configure philosophers with more than 32 priorities
Have one project use more than 32 priorities. The preemptible priorities
are also aligned so that they straddle two priority bitmaps.

Change-Id: I0f0862110d876e40fde45a0d105b769e8603d644
Signed-off-by: Benjamin Walsh <benjamin.walsh@windriver.com>
2016-11-18 18:46:15 -05:00
Benjamin Walsh
385e02ba52 kernel: support for more than 32 total priorities
In addition to more priorities taking more memory to host them, finding
the next thread to run when it is not cached is slower, since each extra
set of 32 priorities maps to a loop iteration. That loop is removed
entirely when the number of priorities is less than 32 (31 plus the idle
thread).
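
Purely as an illustration of the cost described above (this is not the
kernel code; the names are made up), a sketch of the bitmap scan:

    #include <stdint.h>

    #define NUM_BMAPS 2   /* e.g. 64 total priorities -> two 32-bit bitmaps */

    static uint32_t prio_bmap[NUM_BMAPS];   /* one bit per ready priority */

    static int find_next_ready_prio(void)
    {
        for (int i = 0; i < NUM_BMAPS; i++) {  /* one iteration per 32 prios */
            if (prio_bmap[i] != 0) {
                /* __builtin_ctz: index of the lowest set bit (GCC builtin) */
                return (i * 32) + __builtin_ctz(prio_bmap[i]);
            }
        }
        return -1;   /* nothing ready */
    }

With a single bitmap the loop collapses to one unconditional lookup.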

Fixes ZEP-1303.

Change-Id: I3205df90d379a0f4456ff1d7f1aaa67ad2cddf15
Signed-off-by: Benjamin Walsh <benjamin.walsh@windriver.com>
2016-11-18 18:46:15 -05:00
Juan Manuel Cruz
5a5d878252 win-build: Fixes a kconfig incompatibility for Windows
In Windows systems, the rename() function fails if the new name
of the original file corresponds to a file that already exists.

The fix removes the new file before renaming the original one.

Jira: ZEP-980

Change-Id: Ib3a43db86c0dd3fabb592f53ea7619eb5738bb65
Signed-off-by: Juan Manuel Cruz <juan.m.cruz.alcaraz@intel.com>
2016-11-18 18:43:41 -05:00
Juan Manuel Cruz
cf6e5cf730 enc28j60: Fixes an issue reading/writing long frames from SPI
Jira: ZEP-1302
Change-Id: Ia58d51aee14281aaeb2f8e85fbbf8c250eae8e06
Signed-off-by: Juan Manuel Cruz <juan.m.cruz.alcaraz@intel.com>
2016-11-18 18:43:41 -05:00
Allan Stephens
7f1d5a47e4 kernel: Minor optimization to kernel event logger timestamping
Rewrites the timestamping logic to always generate timestamps
via a function pointer that is initialized to sys_cycle_get_32(),
but can be changed to point to a user-supplied function. This
eliminates the need for an if/then/else construct in every place
that a timestamp is generated.
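
A minimal sketch of that function-pointer scheme (illustrative names, not
the actual kernel symbols; only sys_cycle_get_32() is from the commit):

    #include <stdint.h>

    extern uint32_t sys_cycle_get_32(void);   /* default timestamp source */

    static uint32_t (*timestamp_get)(void) = sys_cycle_get_32;

    void event_logger_set_timestamp_func(uint32_t (*func)(void))
    {
        timestamp_get = func;   /* user-supplied override */
    }

    static inline uint32_t event_logger_timestamp(void)
    {
        return timestamp_get();   /* no if/then/else at each call site */
    }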

Change-Id: Id11f8c41b193a93cece16565978a525056010f0e
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-18 18:43:41 -05:00
Allan Stephens
aca2baa43a doc: Revise kernel event logger documentation
Makes the purpose and capabilities of the kernel event logger
clearer, and leaves much of the low-level detail relating to
use of the configuration options and APIs to the configuration
option guide and API guide, respectively. Also corrects some
bugs in the example code for retrieving event information.

Also updates the API guide to make a clear distinction between
the general purpose event logger framework and the kernel event
logger (which is a specific instance of this framework).

Change-Id: I924f65092b2b0e5050af13376b5da85a6cdc1a65
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-18 18:43:40 -05:00
Allan Stephens
bf77e2616d doc: Fix up API descriptions for kernel event logger
Prepares the kernel event logger APIs for inclusion in the
API guide. Also corrects a couple of other issues:

* Gets rid of obsolete thread monitor code.
* Renames "timer_func" global variable to "_sys_k_timer_func"
  to align it with kernel naming conventions.

Change-Id: I93d403f83ae44ff45dda489c2ead7bfec6ce1fa3
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-18 18:43:40 -05:00
Allan Stephens
cfa7ad5c4a kernel: Ensure event logger APIs convert timeouts to milliseconds
Event logger APIs still express timeout delays in ticks;
need to convert to milliseconds when using unified kernel APIs.

Change-Id: I5fab66be660621cd2029417eaff3758e3ef4ba2c
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-18 18:43:40 -05:00
Juan Manuel Cruz
30414fd866 sensor: fixes program hangs in the apds9960 sample
If an i2c transaction fails, the sample hangs the program in an
infinite loop.

This commit removes the infinite loop and reports back the
error code from the i2c transaction.

Change-Id: I38d350a805af6bec43f2fa8d4af6ce4e3cc27662
Coverity-CID: 151991
Coverity-CID: 151992
Signed-off-by: Juan Manuel Cruz <juan.m.cruz.alcaraz@intel.com>
2016-11-18 18:43:40 -05:00
Juan Manuel Cruz
7d21b5402c sensor: fixes dead code in the apds9960 sample
If the gpio or spi devices are not found there is no
need to keep the device busy in a loop for this particular
sample.

Since it is not possible to continue execution it is better
to simply end the application.

Change-Id: Ie25ea970a479db2a2f339ca2b37f88541a45ef97
Coverity-CID: 151973
Signed-off-by: Juan Manuel Cruz <juan.m.cruz.alcaraz@intel.com>
2016-11-18 18:43:40 -05:00
Vinicius Costa Gomes
3ed3f29223 iot/zoap: Fix decoding of 16-bit delta
When an option code or length representation was encoded in a 16-bit
value, the access was wrong.

Coverity-CID: 151963

Change-Id: Ie7741998cbde348ccf490a6686e68a1ace99920e
Signed-off-by: Vinicius Costa Gomes <vinicius.gomes@intel.com>
2016-11-18 18:43:40 -05:00
Anas Nashif
e9a4431362 tests: test CONFIG_KERNEL_DEBUG and CONFIG_ASSERT
Enable this option to test any usage of structs and variables inside
macros.

Change-Id: I6ec64fb865e87fc0771ae10f0c4eb63f6144c88a
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2016-11-18 18:43:40 -05:00
Benjamin Walsh
56816bdbbf kernel: fix obsolete access to fields in K_DEBUG() calls
When moving the arch-specific thread structure to an arch-agnostic one,
some field accesses used in K_DEBUG statements were missed; these
statements are turned off by default.

Change-Id: Ife0f49b8185a0db468deab73555f7034f20ca3e8
Signed-off-by: Benjamin Walsh <benjamin.walsh@windriver.com>
2016-11-18 18:43:40 -05:00
Benjamin Walsh
420594122a kernel: fix thread prio and stack size types in some APIs
Prio should be an int, not a fixed-size int32_t, since its values are
small integers. This aligns with the prio parameters of the other APIs.

Stack size should be size_t.

Change-Id: Id29751b86c4ad7a7c2a7ffe446c2a96ae83c77bf
Signed-off-by: Benjamin Walsh <benjamin.walsh@windriver.com>
2016-11-18 18:43:40 -05:00
Inaky Perez-Gonzalez
0efe69ec31 test_static_idt: fix uninitialized variable
The divide-by-zero test was using an uninitialized variable that
Coverity was unhappy about. Simple fix: just initialize it to any
non-zero value.

Change-Id: I9e5865a99e7a8eb3ee52421cc3dcb6717dca1ad1
Coverity-ID: 152053
Signed-off-by: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
2016-11-18 18:43:40 -05:00
Inaky Perez-Gonzalez
8173b881ba tests/legacy/kernel/test_libs: use memcpy() vs strncpy()
Coverity complained that using strncpy() to fill a buffer of
size N with a string of the same size leaves no room for the final
\0.

This is a valid concern; however, the usage is valid too, as the
writer intended to create a pattern that can be tested later; adding a
\0 would break the pattern.

So instead, use memcpy() for the same function.
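
A sketch of the intent, with hypothetical buffer names (not the test code):

    #include <string.h>

    #define PATTERN_SIZE 16

    /* exactly PATTERN_SIZE characters; the terminating '\0' is not stored */
    static const char pattern[PATTERN_SIZE] = "0123456789abcdef";
    static char buf[PATTERN_SIZE];

    void fill_pattern(void)
    {
        /* strncpy(buf, pattern, PATTERN_SIZE) is what Coverity flagged;
         * memcpy() copies exactly PATTERN_SIZE bytes and appends nothing. */
        memcpy(buf, pattern, sizeof(buf));
    }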

Change-Id: If52d02ce41731348f4a2d750c79f9e1c51f3afcf
Coverity-ID: 151947
Signed-off-by: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
2016-11-18 18:43:40 -05:00
Inaky Perez-Gonzalez
e7c091bba2 scripts/kconfig: use snprintf() vs sprintf()
Coverity reported issue 150819, which stems from Flex-generated code
from zconf.l in which sprintf() was used. Because of that, the
conf_read_simple() @name parameter could be used to overrun
zconf_open()'s @fullname by crafting the SRCTREE and KCONFIG_ALLCONFIG
environment variables.
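
Illustrative only (not the generated zconf.l code): the point is that
snprintf() bounds the write to the destination size.

    #include <stdio.h>

    void build_fullname(char *fullname, size_t size,
                        const char *srctree, const char *name)
    {
        /* was: sprintf(fullname, "%s/%s", srctree, name);
         * snprintf() cannot overrun 'fullname', however long 'name' is */
        snprintf(fullname, size, "%s/%s", srctree, name);
    }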

Change-Id: I2cff817dccafe0e06b35636bbb7be95e062410af
Coverity-ID: 150819
Signed-off-by: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
2016-11-18 18:43:40 -05:00
Inaky Perez-Gonzalez
4c11ae8e60 test_fp_sharing/nanokernel: fix uninitialized variable
Coverity complains about using an uninitialized variable; there is no
reason to do so, so it is fixed to avoid maintaining a whitelist.

Change-Id: I657f9e7d46b1b9b091e36638c1951b93903fbec3
Coverity-ID: 152048
Signed-off-by: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
2016-11-18 18:43:40 -05:00
Inaky Perez-Gonzalez
af941d5832 test_map: initialize memory block
This is not a strictly necessary initialization, as the data is not
used, but it will keep Coverity happy. Since this is a testcase, there
are no size or speed penalties to the overall kernel, and it avoids
having to manage a whitelist for an issue in scanning tools.

Change-Id: I0ddcf43ca1114356d58f93de57232864246ffe07
Coverity-ID: 152052
Signed-off-by: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
2016-11-18 18:43:40 -05:00
Inaky Perez-Gonzalez
32de7848c4 test_map: fix uninitialized area
Coverity complained about the code using an uninitialized chunk of
memory; harmless, but fixed to avoid having to whitelist.

Change-Id: I5c890ff78fab2799b882b8e4a25c15476702d132
Coverity-ID: 152049
Signed-off-by: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
2016-11-18 18:43:40 -05:00
Inaky Perez-Gonzalez
afe1621713 test_static_idt: fix uninitialized variable
Coverity complains about an easy-to-fix uninitialized variable.

Change-Id: I04bf670c7137df25165d4e37f2f7df2d4004c478
Coverity-ID: 152050
Signed-off-by: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
2016-11-18 18:43:40 -05:00
Inaky Perez-Gonzalez
42b956050a test_fp_sharing: fix uninitialized variable
Coverity complains about this (harmless) issue, so this is a simple fix.

Change-Id: Ibac952157cb0541dbd150d681515280091409864
Coverity-ID: 152051
Signed-off-by: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
2016-11-18 18:43:40 -05:00
Inaky Perez-Gonzalez
04af679197 samples/kernel_event_logger: initialize variable
Fix usage of an uninitialized variable detected by Coverity.

In theory GCC should pick up this situation, but it does not. I've
experimented with adding -Wextra and -Wuninitialized but I cannot get
GCC to complain. I might be missing something else, but in the
meantime, this is a simple fix to remove this issue.

Change-Id: I6fec37719719dfaf7077ce1f464605c93efa8ea2
Coverity-ID: 152054
Signed-off-by: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
2016-11-18 18:43:40 -05:00
Anas Nashif
b082e12265 kernel: remove v2 usage and rename KERNEL_V2_DEBUG
Change-Id: I6b3f07714322ad79aeec2342621a4cddfe84cb2c
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2016-11-18 18:43:40 -05:00
Paul Sokolovsky
3cbabecfcc pinmux: Make default init priority be between GPIO's prio and device prio.
Pinmux driver almost certainly should be initialized before the
rest of hardware devices (which may need specific pins already
configured for them), and usually after generic GPIO drivers.
Thus, its priority should be between KERNEL_INIT_PRIORITY_DEFAULT
(default 40) and KERNEL_INIT_PRIORITY_DEVICE (default 50). Thus,
we set PINMUX_INIT_PRIORITY to 45.

There are exceptions to the rule above for particular boards. For
example, BOARD=galileo has GPIO and pinmuxer on I2C bus and thus
overrides PINMUX_INIT_PRIORITY to be much higher. Note that while
PINMUX_INIT_PRIORITY was defined previously (at 60), it was used
only for galileo, which overrides it anyway.

This fix was prompted by investigating why the eth_ksdk driver was
non-functional after the kernel priorities re-hashing: both eth_ksdk
and pinmux used the same priority, and eth_ksdk happened to run
before pinmux. While bumping eth_ksdk priority would help in the
particular case, the same would likely reoccur with other drivers
like I2C, SPI, etc.
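
As a hedged sketch of how such an ordering is typically expressed (the
driver and device names here are placeholders, and the init level shown
is only illustrative, not the actual pinmux driver code):

    #include <device.h>
    #include <init.h>

    static int my_pinmux_init(struct device *dev)
    {
        ARG_UNUSED(dev);
        /* route the pins needed by later drivers here */
        return 0;
    }

    /* CONFIG_PINMUX_INIT_PRIORITY = 45 sits between
     * KERNEL_INIT_PRIORITY_DEFAULT (40) and KERNEL_INIT_PRIORITY_DEVICE (50) */
    DEVICE_INIT(my_pinmux, "PINMUX_SKETCH", my_pinmux_init,
                NULL, NULL, POST_KERNEL, CONFIG_PINMUX_INIT_PRIORITY);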

Change-Id: Ie5ca3135c1ee2fe8d9cf48d5c12e62eac63487f7
Signed-off-by: Paul Sokolovsky <paul.sokolovsky@linaro.org>
2016-11-18 18:40:17 -05:00
Inaky Perez-Gonzalez
4d2ad79207 legacy/kernel/test_{static_id,stackprot}: 'fatal fault' is not a failure
By default, when a 'fatal fault' message is seen in the output of any
testcase, it is considered an immediate fatal condition and the test case
is aborted.

However, in all such cases, the testcase is provoking the situation to
verify the condition is caught. In this case it shall NOT be considered
a fatal fault, and the default is overridden to allow the test to proceed.

Change-Id: Id4e9138e5f0fcb8cd77efbb1831897fb0946ba20
Signed-off-by: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
(cherry picked from commit da5f3a5c89)
2016-11-18 19:42:58 +00:00
Allan Stephens
0cddc4b665 doc: Minor cosmetic tweaks for kernel API descriptions
Change-Id: Ie989b45b19e5e70958301dd8d903cf2876709f5a
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-17 21:32:10 -05:00
Allan Stephens
4cdbfbf9e2 doc: Add descriptions for clock-related helper macros
Also fixes up Kernel Primer examples to use these macros.

Change-Id: Ib1bc9e3f85ab75f81986bc3930fb287266a886b5
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-17 21:32:10 -05:00
Allan Stephens
c38888bccb doc: Revise timer example to use workqueue instead of alert
Rewrites the example of a timer's expiry routine offloading
processing that can't be done at interrupt level. The example
now submits work to the system workqueue directly, rather than
using an alert. This saves footprint by eliminating the need
for alert-related API support that isn't needed. (This is a
true savings, since the alert code just called the same
workqueue APIs the example now calls directly.)
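
A minimal sketch of the revised pattern, assuming the unified kernel APIs
of this release (handler and variable names are illustrative):

    #include <zephyr.h>

    static struct k_work heavy_work;
    static struct k_timer my_timer;

    static void heavy_work_handler(struct k_work *work)
    {
        /* processing that must not run at interrupt level goes here */
    }

    static void my_timer_expiry(struct k_timer *timer)
    {
        /* interrupt context: hand the work off to the system workqueue */
        k_work_submit(&heavy_work);
    }

    void main(void)
    {
        k_work_init(&heavy_work, heavy_work_handler);
        k_timer_init(&my_timer, my_timer_expiry, NULL);
        k_timer_start(&my_timer, K_SECONDS(1), K_SECONDS(1));
    }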

Change-Id: I378e40aef33014f2c75c4f57531f75247d50e479
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-17 21:32:10 -05:00
Allan Stephens
be0db01093 doc: Fix up API descriptions for ring buffers
Change-Id: I82453c1fb5365d7dfe35cb1bc9eba50c71a47b17
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-17 21:32:10 -05:00
Allan Stephens
a9cd7c0498 doc: Fix up API description for IRQ_CONNECT()
Change-Id: I5ea1bd28f355d78c724948568c160ef1b32b5eb5
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-17 21:32:10 -05:00
Allan Stephens
6af438c440 doc: Fix up return value descriptions for kernel APIs
Return value descriptions using the "@retval" tag now reflect
the fact that they appear on a separate line from the value
they are describing.

Change-Id: I3e3e347d133ad998e7db50a99369d41cbfb9efcc
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-17 21:32:10 -05:00
Allan Stephens
0ae966be58 doc: Improve descriptions of workqueue APIs
The API guide now does a better job of explaining how to use
a workqueue. Also hides information about workqueue internals
and fixes several errors and omissions.

Change-Id: I6492c1c6105c258ce98365ca33059d8f32c1be41
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-17 21:32:10 -05:00
Allan Stephens
33d0716c21 doc: Improve descriptions for some user-supplied functions
The API guide now does a better job of explaining how to correctly
write these functions.

Change-Id: Ib1df55eb28fa408f3f786f122353e37505002f07
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-17 21:32:10 -05:00
Allan Stephens
a176dd3274 doc: Enable Kernel Primer links to macro-type APIs
Also adds a link to function-type API that was missing.

Change-Id: Ie671ad2f239cdca3ac1a2eb33248dfecfa251c79
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-17 21:32:09 -05:00
Baohong Liu
ed8c6e2d1f samples: grove_lcd: stop the app if device binding fails
Proceed to LCD programming only if device binding succeeds.
Otherwise, a NULL pointer would be dereferenced. This
was caught by Coverity.
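
A sketch of the guard described above (the device name string is
illustrative, not necessarily the one used by the sample):

    #include <zephyr.h>
    #include <device.h>
    #include <misc/printk.h>

    void main(void)
    {
        struct device *glcd = device_get_binding("GROVE_LCD");

        if (glcd == NULL) {
            printk("Grove LCD not found\n");
            return;   /* stop instead of dereferencing a NULL pointer */
        }

        /* ... proceed with LCD programming ... */
    }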

Coverity-CID: 151986

Change-Id: Ibdb658f530203428aa3e53f358e0788fc1502b06
Signed-off-by: Baohong Liu <baohong.liu@intel.com>
2016-11-17 21:17:13 -05:00
Javier B Perez
c280ce6bf9 test: power states: fix dead code issue
Coverity detected constant values in some variables, due to the
exclusive config selection in the code.

Change-Id: Id27b658f3cd70dce626fef054457a9c726b3b957
CID: 151974, 151972 and 151971
Signed-off-by: Javier B Perez <javier.b.perez.hernandez@intel.com>
2016-11-17 21:17:13 -05:00
Sergio Rodriguez
67d49c2344 drivers: gpio_dw: Remove contradictory if statement evaluation
This fixes an always false evaluation of the gpio I/O direction

This issue was reported by Coverity

Coverity-CID: 151978

Change-Id: I93ec3319a3f18d564c961a5cbd9dcc9c60efbeb7
Signed-off-by: Sergio Rodriguez <sergio.sf.rodriguez@intel.com>
2016-11-17 21:17:13 -05:00
Sergio Rodriguez
f3c2664e53 drivers: gpio_atmel: Fix erronous if statement
The GPIO_INT_ACTIVE_LOW value is zero, so the mask assignment is
never executed. Using the bit complement, GPIO_INT_ACTIVE_HIGH, the
proper mask is assigned.
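
An illustrative fragment of the bug class (macro values and mask names are
made up; only the fact that GPIO_INT_ACTIVE_LOW expands to 0 is from the
commit):

    #include <stdint.h>

    #define GPIO_INT_ACTIVE_LOW    (0 << 1)   /* the value is 0 */
    #define GPIO_INT_ACTIVE_HIGH   (1 << 1)
    #define LEVEL_LOW_MASK         0x1        /* hypothetical register bits */
    #define LEVEL_HIGH_MASK        0x2

    static uint32_t level_mask_for(int flags)
    {
        /* was: if (flags & GPIO_INT_ACTIVE_LOW) ... -- always false */
        if (!(flags & GPIO_INT_ACTIVE_HIGH)) {
            return LEVEL_LOW_MASK;    /* active low */
        }
        return LEVEL_HIGH_MASK;       /* active high */
    }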

This issue was reported by Coverity

Coverity-CID: 151966

Change-Id: Ibc7d2e4c3ebee249b5ab9719f8177cc14c0d1d33
Signed-off-by: Sergio Rodriguez <sergio.sf.rodriguez@intel.com>
2016-11-17 21:17:13 -05:00
Sergio Rodriguez
22bbdc2f85 soc: stm32f1: gpio: Fix unnecessary else statement
The bitfield determining the I/O direction already defines the pin
as either input or output; it cannot be neither or both at the same time.

This issue was reported by Coverity

Coverity-CID: 151970

Change-Id: I18d5387139d6834004ba3269c5b54176bdc97ea7
Signed-off-by: Sergio Rodriguez <sergio.sf.rodriguez@intel.com>
2016-11-17 21:17:13 -05:00
Baohong Liu
3b626c5e15 samples: button: stop the app if device binding fails
Stop the app from running if device binding fails. Otherwise,
a NULL pointer would be dereferenced. This was caught by
Coverity.

Coverity-CID: 151988

Change-Id: I8245d938498a51123249fbd069935900ad660314
Signed-off-by: Baohong Liu <baohong.liu@intel.com>
2016-11-17 21:17:13 -05:00
Flavio Santes
794a47dedf tests/tinycrypt: Fix wrong sizeof argument in test_ccm_mode (2nd)
This commit fixes an issue in the test_ccm_mode.c file:

sizeof(data) is used to compute the length of the array pointed to
by the 'uint8_t *data' pointer.

At the same function scope, there is a variable (dlen) that already
specifies the required length, so we use that variable instead of
the 'sizeof' function call.

This issue was not reported by Coverity, although is worth to fix it.
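
An illustrative reconstruction of the bug class (not the tinycrypt test
code; names are hypothetical):

    #include <stdint.h>
    #include <string.h>

    /* assumes dlen <= sizeof(copy) */
    size_t copy_payload(const uint8_t *data, size_t dlen)
    {
        uint8_t copy[64];

        /* wrong: memcpy(copy, data, sizeof(data)) copies only the size of
         * the pointer (4 or 8 bytes), not the payload length */
        memcpy(copy, data, dlen);   /* right: the length is already known */
        return dlen;
    }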

Change-Id: I27cbf8c7000a4189a42d193f6445996d4b852aa6
Signed-off-by: Flavio Santes <flavio.santes@intel.com>
2016-11-17 21:17:13 -05:00
Flavio Santes
e203a64000 tests/tinycrypt: Fix wrong sizeof argument in test_ccm_mode
This commit fixes the wrong sizeof argument error reported by
Coverity.

Coverity-CID: 152032

Change-Id: I2ee3089b4b840f4a1b8ba0303e92a3311c07ffeb
Signed-off-by: Flavio Santes <flavio.santes@intel.com>
2016-11-17 21:17:13 -05:00
Flavio Santes
25a40b19ea tests/tinycrypt: Fix dead code issue (2nd)
This commit fixes the dead code issue reported by Coverity.

Coverity-CID: 151977

Change-Id: Iaa31c032456f48e1af1d1c9d722f051ac5519ccf
Signed-off-by: Flavio Santes <flavio.santes@intel.com>
2016-11-17 21:17:13 -05:00
Flavio Santes
9366bc161b tests/tinycrypt: Fix dead code issue (1st)
This commit fixes the dead code issue reported by Coverity.

Coverity-CID: 151975

Change-Id: I449341d1f540abe149e8ad9197a64d52cd5722cd
Signed-off-by: Flavio Santes <flavio.santes@intel.com>
2016-11-17 21:17:13 -05:00
Benjamin Walsh
602295a88f dlist: fix SYS_DLIST_FOR_EACH_SAFE when operating on empty list
There was no check to see if the head of a list was empty before trying
to fetch the next node in the list. The fix is added to
sys_dlist_peek_next() so that it also returns NULL if the node parameter
is NULL, in addition to when it is the tail of the list.

Since the value is not used until the second iteration of the loop, and
there will be no second iteration if the list is empty, as long as the
CPU allows reading at address 0, this was not causing any issues.

Our ARC targets did not seem to like that.
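
A hedged, simplified rendering of the fixed helper, assuming the
sys_dlist_t/sys_dnode_t types and field names from the dlist header:

    static inline sys_dnode_t *peek_next_sketch(sys_dlist_t *list,
                                                sys_dnode_t *node)
    {
        if (node == NULL) {
            return NULL;   /* empty list: the SAFE iterator passes NULL */
        }
        return (node == list->tail) ? NULL : node->next;
    }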

Fixes ZEP-1263 and ZEP-1297.

Change-Id: I07ca16592d206d13662226d1249f487ee78c06aa
Signed-off-by: Benjamin Walsh <walsh.benj@gmail.com>
2016-11-17 21:17:13 -05:00
Flavio Santes
829c3ceb12 tests/tinycrypt: Fix wrong sizeof argument
Fix the issue reported by Coverity: wrong sizeof argument.

Coverity-CID: 152042

Change-Id: I5d593ba54bf8f69f3c9d41a8b2878827d1cc186a
Signed-off-by: Flavio Santes <flavio.santes@intel.com>
2016-11-17 21:17:12 -05:00
Anas Nashif
9f36bbc07a kernel: event_logger: use POST_KERNEL instead of NANOKERNEL
NANOKERNEL is obsolete and this kernel service is still using it,
causing deprecation warnings. Move it to POST_KERNEL.

Change-Id: I17fabd080645f93a8599f4ea25da844e1ec5f4bb
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2016-11-17 21:17:12 -05:00
Anas Nashif
145a4c93fa Revert "build: Handle ALL_LIBS dependencies correctly"
This reverts commit 608abd987c.

This change is breaking build dependencies.

Change-Id: Id8e9dbfc14b72933c402d25847615cddbfaca40d
Jira: ZEP-1291
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2016-11-17 21:17:12 -05:00
Jithu Joseph
0ce96e850d tests: libs: Fix string overflow
This fixes a string overflow past the end of a buffer,
which was reported by Coverity.

Coverity-CID: 152044

Change-Id: I5b331135e338fa43b5589a9488b06367e8cad5a7
Signed-off-by: Jithu Joseph <jithu.joseph@intel.com>
2016-11-17 21:17:12 -05:00
Baohong Liu
27d7a9e29f drivers: cc2520: fix variable type mismatching issue
The variable type mismatching was caught by LLVM.

Jira: ZEP-1179

Change-Id: If26c881d207a6cedc52b7589c5d7ebb2040c7ab7
Signed-off-by: Baohong Liu <baohong.liu@intel.com>
2016-11-17 21:17:12 -05:00
Baohong Liu
391ec95f38 net: ip: fix variable type mismatching issue
The variable type mismatching was caught by LLVM.

Jira: ZEP-1179

Change-Id: I92ca14b7a2c0507a86a6b6abaa567a5091622ad1
Signed-off-by: Baohong Liu <baohong.liu@intel.com>
2016-11-17 21:17:12 -05:00
Baohong Liu
cb409a67fd samples: net: fix a memcmp len error
memcmp compares two strings or buffers, so the length should be
the buffer length, not the size of the pointer to the buffer.
This was caught by LLVM.
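
Illustrative only (names are made up): with a "uint8_t *buf" parameter,
sizeof(buf) is the pointer size, so only 4 or 8 bytes were compared.

    #include <stdint.h>
    #include <string.h>

    int payload_matches(const uint8_t *buf, const uint8_t *expected,
                        size_t len)
    {
        /* wrong: memcmp(buf, expected, sizeof(buf)) */
        return memcmp(buf, expected, len) == 0;  /* compare the real length */
    }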

Jira: ZEP-1179

Change-Id: I7fd6b199686b19e7f4a2e1288897483e69ad091e
Signed-off-by: Baohong Liu <baohong.liu@intel.com>
2016-11-17 21:17:12 -05:00
Baohong Liu
cf4ce62590 net: 802.15.4: Fix a variable type mismatching issue
This variable type mismatching was caught by LLVM.

Jira: ZEP-1179

Change-Id: I891dc9d55055292e6a749f300e995798040d0b24
Signed-off-by: Baohong Liu <baohong.liu@intel.com>
2016-11-17 21:17:12 -05:00
Jithu Joseph
3188ed4e64 samples: usb: Check return value fix
This commit fixes a missing function return check reported by
Coverity.

Coverity-CID: 151949

Change-Id: Iedf090b7f2ded9f20ff6d796f1cd5c02990b0a4e
Signed-off-by: Jithu Joseph <jithu.joseph@intel.com>
2016-11-17 21:17:12 -05:00
Sergio Rodriguez
f0523be409 samples: drivers: gpio: Exit from testcase if device not found
Exit from the test case when the gpio device is not found,
to avoid a null pointer dereference.

This issue was reported by Coverity

Coverity-CID: 151980

Change-Id: I44f13131d44c7c093781e1f11f8481e7ef8175c9
Signed-off-by: Sergio Rodriguez <sergio.sf.rodriguez@intel.com>
2016-11-17 21:17:12 -05:00
Baohong Liu
db089efe48 tests: benchmark: fix a string format issue
A popular issue, "format is not a string literal", was
caught by LLVM. Let's make it a string literal.

Jira: ZEP-1179

Change-Id: I2b4a5aef750b772504bf0e6f005dab2ff9ac3e7c
Signed-off-by: Baohong Liu <baohong.liu@intel.com>
2016-11-17 21:17:12 -05:00
Sergio Rodriguez
628ecfa4d7 samples: drivers: i2c fram : Exit from testcase if device not found
Exit from the test case when the gpio device is not found,
to avoid a null pointer dereference.

This issue was reported by Coverity

Coverity-CID: 151982

Change-Id: Ifaed47b2b48359dacfdb3111ca2895d5912779e6
Signed-off-by: Sergio Rodriguez <sergio.sf.rodriguez@intel.com>
2016-11-17 21:17:12 -05:00
Baohong Liu
f8d3ac0130 drivers: rtc: fix enum type mismatching issue
The enum type mismatching was caught by LLVM.

Jira: ZEP-1179

Change-Id: I50b68e201ef6fb18a02eeda2a2e7548dad3f358c
Signed-off-by: Baohong Liu <baohong.liu@intel.com>
2016-11-17 21:17:12 -05:00
Baohong Liu
853c11885c tests: spi: add return value check
Add function return value check. This was caught by
Coverity.

Coverity-CID: 151950

Change-Id: Iee550e15d124f05f0b0514fdad22d06c617beac2
Signed-off-by: Baohong Liu <baohong.liu@intel.com>
2016-11-17 21:17:12 -05:00
Benjamin Walsh
f1c880aa8a kernel/arm: fix race condition when setting _Swap() return value
There was a possible race condition when setting the return value of a
thread that is pending, from an ISR.

A kernel function causes a thread to pend, with the following series of
steps:

- disable interrupts
- move current thread to wait_q
- call _Swap

Depending on whether it is running on M3/4 or M0+, _Swap will either issue
an svc #0, or pend PendSV directly. The same problem exists in both cases.

M3/4:
__svc will:
- enable interrupts
- trigger __pendsv

M0+:
_Swap() will enable interrupts.

__pendsv will:
- save register context including PSP into the thread struct

If an interrupt occurs between interrupts being enabled and
__pendsv saving PSP, and the ISR sets the pending thread's return value,
this will happen:

- sees the thread in a wait_q
- removes it
- makes it ready
- calls _set_thread_return_value
- _set_thread_return_value looks at the thread's saved PSP to poke
  the value

In this scenario, PSP hasn't yet been updated by __pendsv, so it's a
stale value from the previous context switch, resulting in an
unpredictable word on the stack getting set to the return value.

There is no way to fix this issue and still have the return value
delivered directly in the pending thread's exception stack frame, in the
M0+ case. There will always be a window between the unlocking of
interrupts and PendSV being handled. On M3/4, it could be possible with
the mix of SVC and PendSV, since the exception stack frame is created in
the __svc handler. However, because we want to keep the two
implementations as close as possible, and there were talks of moving
M3/4 to using PendSV only, to save an exception, the approach taken
solves both cases.

The approach taken is similar to the ARC and Nios2 ports, where
there is a field in the thread structure that holds the return value.
_Swap() then loads r0/a1 with that value just before returning.
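
A hedged sketch of that approach (field and helper names are illustrative,
not the actual arch code):

    struct arch_thread_sketch {
        unsigned int swap_return_value;   /* parked here by the ISR */
        /* ... saved registers, PSP, etc. ... */
    };

    static inline void set_return_value_sketch(struct arch_thread_sketch *t,
                                               unsigned int value)
    {
        /* safe at any time, even before __pendsv has saved PSP */
        t->swap_return_value = value;
    }

    /* _Swap() copies swap_return_value into r0/a1 just before returning. */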

Fixes ZEP-1289.

Change-Id: Iee7e06fe3f8ded84aff918fd43408c7f589344d9
Signed-off-by: Benjamin Walsh <benjamin.walsh@windriver.com>
2016-11-17 11:03:40 -05:00
Sergio Rodriguez
653328c6e8 samples: aio comparators: Use expected pointer type in printf
The data structure member being used is a character array;
dereferencing this array gives a **char instead of the expected
*char type.

This issue was reported by Coverity

Coverity-CID: 152030
Coverity-CID: 152033

Change-Id: Ied67e4b2d47017e6ad5e40b9b6fca1b496c483ed
Signed-off-by: Sergio Rodriguez <sergio.sf.rodriguez@intel.com>
2016-11-17 11:03:40 -05:00
Sergio Rodriguez
d88617db9b drivers: gpio: Remove contradictory if statement evaluation
This fixes an always false evaluation of the gpio I/O direction

This issue was reported by Coverity (CID 150821).

Change-Id: I6c0e9fe405cbd3e35454a81754fa0b1c721691f0
Signed-off-by: Sergio Rodriguez <sergio.sf.rodriguez@intel.com>
2016-11-17 11:03:40 -05:00
Sergio Rodriguez
385770cf21 drivers: gpio_ss: Remove contradictory if statement evaluation
This fixes an always false evaluation of the gpio I/O direction

This issue was reported by Coverity

Coverity-CID: 151833

Change-Id: Ie952d6f50c0383d5631325b69e8e8b234c67c4b8
Signed-off-by: Sergio Rodriguez <sergio.sf.rodriguez@intel.com>
2016-11-17 11:03:40 -05:00
Sergio Rodriguez
cf00c1c184 drivers: gpio_k64: Remove contradictory if statement evaluation
This fixes an always false evaluation of the gpio I/O direction

This issue was reported by Coverity

Coverity-CID: 151834

Change-Id: I033e368b2e91d888f2e8a797490df757513c3906
Signed-off-by: Sergio Rodriguez <sergio.sf.rodriguez@intel.com>
2016-11-17 11:03:40 -05:00
Flavio Santes
6a7b6679b1 sensors/mcp9808: Evaluate sensor_channel_get return code
The sensor_channel_get return code is now evaluated.

Change-Id: Ib931d6caba65af7195bad53c62e6e5a3033b49e8
Signed-off-by: Flavio Santes <flavio.santes@intel.com>
2016-11-17 11:03:40 -05:00
Flavio Santes
3893503f49 sensors/mcp9808: Evaluate sensor_sample_fetch return code
sensor_sample_fetch return code is now evaluated.

Coverity-CID: 151957

Change-Id: I79b9f44c79ac13e8d7da55c9e3866ad504a4a450
Signed-off-by: Flavio Santes <flavio.santes@intel.com>
2016-11-17 11:03:40 -05:00
Luiz Augusto von Dentz
b42719243d Bluetooth: GATT: Fix using out of scope variable
This fixes a defect found by Coverity: 152027, pointer to local outside
scope.

Change-Id: I50f196a04363ffa6e6654b71a9a1d89034580413
Signed-off-by: Luiz Augusto von Dentz <luiz.von.dentz@intel.com>
2016-11-17 16:02:30 +00:00
Allan Stephens
fd45cb4567 kernel: Enhance naming of memory pool configuration options
Replaces confusing (and excessively long) configuration option
names with more intuitive names. Also enhances the description
of each option to clarify its use.

Change-Id: If4d4541407627482b1e90302cfc9df3bc8130d44
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-16 18:06:57 -05:00
Allan Stephens
9f0045a30a doc: Incorporate kernel APIs into API documentation guide
Change-Id: Ib5e5aa14534af4789d8247e6096913e09731f5bb
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-16 18:06:57 -05:00
Allan Stephens
662c8bee81 doc: Various corrections to Kernel Primer
* Ensures all references to kernel functions are correctly
  tagged so they will auto-link to the API guide.

* Adds references to a few functions and macros that were
  omitted.

Change-Id: I26ccd9c29ea123db2807f2df4d05d574932c6849
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-16 18:06:57 -05:00
Allan Stephens
6d0fa01492 doc: Various corrections to doxygen info for Kernel APIs
Most kernel APIs are now ready for inclusion in the API guide.
The APIs largely follow a standard template to provide users
of the API guide with a consistent look-and-feel.

Change-Id: Ib682c31f912e19f5f6d8545d74c5f675b1741058
Signed-off-by: Allan Stephens <allan.stephens@windriver.com>
2016-11-16 18:06:57 -05:00
Iván Briano
afe118fa1d drivers: spi_ss: Fix setting of wrong config for SPI 1
Jira: ZEP-1287

Change-Id: I3678631aa5843e769b8e1611734767fa6264b9af
Signed-off-by: Iván Briano <ivan.briano@intel.com>
2016-11-16 14:35:35 +00:00
Vincenzo Frascino
247a2a0671 console: Fix unreachable code condition
This patch fixes an unreachable code condition in the uart_console
driver.

If UART_CONSOLE_DEBUG_SERVER_HOOKS was not defined
handled_by_debug_server in console_out was always 0.

This issue was reported by Coverity (CID 131627).

Change-Id: I4376c3e5b3e68220218df6aabd91b6a8900ca31f
Signed-off-by: Vincenzo Frascino <vincenzo.frascino@linaro.org>
2016-11-16 08:32:25 -05:00
Vincenzo Frascino
cb9033d10f sensor: Fix Unchecked return value in bma280 driver
This patch fixes two "Unchecked return value" conditions in the bma280
driver.

The issue was reported by Coverity (CID 151953).

Change-Id: I2e595b67619411594cec527f358f6c3d3d034550
Signed-off-by: Vincenzo Frascino <vincenzo.frascino@linaro.org>
2016-11-16 08:31:24 -05:00
Sergio Rodriguez
bc7e0455c8 tests: crypto: Fix unchecked return value on CTR PRNG test case
This issue was reported by Coverity (CID 151952)

Change-Id: I59a20a3ccbe606ef634db98ac6cc6889a3973ec3
Signed-off-by: Sergio Rodriguez <sergio.sf.rodriguez@intel.com>
2016-11-16 07:39:31 -05:00
Sergio Rodriguez
71e85e390b drivers: pwm: Fix uninitialized pointer
This fixes an uninitialized pointer being passed to and evaluated by
a subsequent function.

This issue was reported by Coverity (CID 150824)

Change-Id: If1f636a44cc675b56e426b1de85895b74ba7105e
Signed-off-by: Sergio Rodriguez <sergio.sf.rodriguez@intel.com>
2016-11-16 07:39:31 -05:00
Johan Hedberg
eee3a430dc Bluetooth: Use convenience macros for timeout durations
Using the K_* macros makes it easier to read what exactly the various
timeouts are.

Change-Id: Ia405d3760b8e600af7e33a7221ef6ec717708973
Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2016-11-16 07:11:21 -05:00
Vinayak Chettimada
32ef1480e9 Bluetooth: Controller: Remove unused util functions
Change-id: I7b691d082d080239c35b63221e3c6c7aa93ed58e
Signed-off-by: Vinayak Chettimada <vinayak.kariappa.chettimada@nordicsemi.no>
2016-11-16 07:10:51 -05:00
Vinayak Chettimada
39410d79fe Bluetooth: Controller: Fix incorrect irq priority check
External interrupts are indexed from 16, whereas
0 to 15 are ARM Cortex-M exceptions. Fixed the code in
_irq_is_priority_equal to fetch the correct external
interrupt line ISR priority.

Change-id: I9cfd411480e78dfc9635e72d14df9d667a9d8400
Signed-off-by: Vinayak Chettimada <vinayak.kariappa.chettimada@nordicsemi.no>
2016-11-16 07:05:58 -05:00
Ramesh Thomas
40c944f0b6 tests: power_states: Update testcase.ini to include arc
testcase.ini was not building for ARC. This app runs
on both x86 and ARC.

Change-Id: I961d56079aa1db7d84e0fcc87780ba11d7f4d831
Signed-off-by: Ramesh Thomas <ramesh.thomas@intel.com>
2016-11-15 21:23:59 -05:00
Ramesh Thomas
efcdfce517 samples: power_mgmt: Remove platform filtering of testcases
Remove redundant platform filtering and only use SOC filtering

Change-Id: Ib823e076a874ce61a235eca63eebb7f19d2fdd30
Signed-off-by: Ramesh Thomas <ramesh.thomas@intel.com>
2016-11-15 21:23:59 -05:00
Baohong Liu
e50f05df3e samples: button: fix variable type mismatching issue
The variable type mismatching was caught by LLVM.

Jira: ZEP-1179

Change-Id: I084406601badc64c257cbdd82b9c8b7509549303
Signed-off-by: Baohong Liu <baohong.liu@intel.com>
2016-11-15 21:09:46 -05:00
Baohong Liu
3f0fcedf00 drivers: bmi160: fix variable type mismatching issue
The variable type mismatching was caught by LLVM.

Jira: ZEP-1179

Change-Id: I1193a946ea5814510e6c07668c5d05a5d91445a8
Signed-off-by: Baohong Liu <baohong.liu@intel.com>
2016-11-15 21:08:44 -05:00
Baohong Liu
baab38500c samples: usb: fix variable type mismatching issue
The variable type mismatching was caught by LLVM.

Jira: ZEP-1179

Change-Id: I402c348af142342e37e93619c4da6e3a5bfd82da
Signed-off-by: Baohong Liu <baohong.liu@intel.com>
2016-11-15 20:38:34 -05:00
Sergey Kiselev
06869d4499 sensors: bme280: fix typo in reading trimming parameters
Change-Id: I32e72c2845bd06b10585ac8048f67ac754c2a6d6
Signed-off-by: Sergey Kiselev <sergey.kiselev@intel.com>
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2016-11-15 20:38:17 -05:00
Vincenzo Frascino
de55b9f73a sensor: Fix Unchecked return value issues in bme280 driver
This patch fixes unchecked return value conditions in the bme280
driver.

This issue was reported by Coverity (CID 151961, 151959, 151955).

Change-Id: I3a2dfbabd41ae52b00fa512a40e00c2e36c3b5ca
Signed-off-by: Vincenzo Frascino <vincenzo.frascino@linaro.org>
2016-11-15 19:43:09 -05:00
Vincenzo Frascino
eb7910b9d1 sensor: Fix less-than-zero comparison in bmi160 driver
This patch fixes a less-than-zero comparison of an unsigned value
in the bmi160 driver.
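
Illustrative only (not the bmi160 code): an unsigned variable can never be
negative, so the error branch was dead; keep the bus-read result signed
until it has been checked.

    #include <stdint.h>

    int store_sample_sketch(int raw, uint16_t *out)
    {
        /* 'raw' is the signed return of the bus read; storing it straight
         * into a uint16_t would make the check below unreachable */
        if (raw < 0) {
            return raw;        /* propagate the error */
        }
        *out = (uint16_t)raw;
        return 0;
    }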

This issue was reported by Coverity (CID 152002, 152003).

Change-Id: I703066519652ac1ecdd9ddf7e97ec7dcbe2a9e27
Signed-off-by: Vincenzo Frascino <vincenzo.frascino@linaro.org>
2016-11-15 19:43:09 -05:00
Ramesh Thomas
95f8f6f3e0 samples: power_mgmt: Remove redundant sample power_hooks
This sample was created initially when there were no other
samples in place to enable the CONFIG flags to build the code
inside those flags. However, those CONFIG flags are now
guarded with corresponding "SUPPORTED" flags, which are
enabled in the Kconfigs of SoCs based on their support
for that power feature. This app is for x86, and those
features will not get enabled for this configuration. If
it is still required, then we would need to fake such
support in Kconfig.board of qemu_x86. Removing it, because
those flags will get enabled by sample and test apps of
SoCs that support the power features, causing the code inside
them to get built.

Change-Id: I647be9289a49d69880811abee499a4efd61bbc6a
Signed-off-by: Ramesh Thomas <ramesh.thomas@intel.com>
2016-11-15 19:43:09 -05:00
Ramesh Thomas
b65e208171 samples: power_mgmt: Cleanup and update with new pm interface
Cleaned up and removed some unnecessary code to avoid
distraction from main sample implementation. Updated some
logic based on new PM interface in soc area. Updated README
to indicate it supports x86 and ARC and updated sample
output of both architectures.

Change-Id: I1c9c8348dae403b7ca6fe17ab867e3fbef06ae60
Signed-off-by: Ramesh Thomas <ramesh.thomas@intel.com>
2016-11-15 19:43:09 -05:00
Baohong Liu
2ccf0ad045 tests: spi_test: fix variable type mismatching issue
The variable type mismatching was caught by LLVM.

Jira: ZEP-1179

Change-Id: I37934ef2ee47c521a78086564876843794688d55
Signed-off-by: Baohong Liu <baohong.liu@intel.com>
2016-11-15 19:43:09 -05:00
Benjamin Walsh
efc7ffde75 kernel/arm: fix missing interrupt lock around _is_next_thread_current()
This reverts commit

	"kernel/arm: add comment about _is_next_thread_current"

and fixes the interrupt locking issue.

The comment would have been right if only reads were done on the ready
queue, but that is not the case. It turns out that the comment was written
ignoring the fact that _is_next_thread_current() updates the next-thread
cache when fetching the next thread.

Change-Id: I21c9230f85f4f87a6bbf14fd4a9eb7e19b59f8c5
Signed-off-by: Benjamin Walsh <benjamin.walsh@windriver.com>
2016-11-15 19:17:06 -05:00
Szymon Janc
bee0fd0601 Bluetooth: Fix address type use for passive scanning
This fixes the use of an incorrect address type for passive scanning with
privacy enabled. The controller was not reporting directed advertising
to an RPA address due to the public type being used for passive scan.

This was affecting TC_CONN_GCEP_BV_01_C, TC_CONN_ACEP_BV_01_C and
TC_CONN_DCEP_BV_01_C qualification test cases.

Jira: ZEP-1200

Change-Id: Icc316441fcac1a72d75f9ade27a99030efc846b9
Signed-off-by: Szymon Janc <ext.szymon.janc@tieto.com>
2016-11-15 19:12:07 -05:00
Mariusz Skamra
c6c73c1b2e Bluetooth: Fix not sending L2CAP Connection Parameters Update Request
This fixes an issue where the L2CAP Connection Parameters Update Request
was not sent. There was a check that used the LE features of the host
controller to determine whether the L2CAP procedure or LL shall be used.
It was failing with a 4.2 controller. The check shall test whether the
remote supports the LL Connection Parameters Request Procedure. If it's
not supported, then the L2CAP Connection Parameters Update Procedure
will be used.

Closes ZEP-1220

1/4   L2CAP   TC_LE_CPU_BV_01_C      PASS
2/4   GAP     TC_CONN_CPUP_BV_01_C   PASS
3/4   GAP     TC_CONN_CPUP_BV_02_C   PASS
4/4   GAP     TC_CONN_CPUP_BV_03_C   PASS

Change-Id: I61ad544d9568ca6306a845e05c1a2e28d1693ab4
Signed-off-by: Mariusz Skamra <mariusz.skamra@tieto.com>
2016-11-15 19:11:21 -05:00
Vinayak Chettimada
ae495b7618 Bluetooth: Controller: Fix incorrect auto variable init
Coverity analysis discovered a NULL pointer being
dereferenced when passing an auto variable. The variable is
now correctly assigned the address of a valid default
value variable. As per design, the dereferencing will not
happen, as the master role does not use the passed parameter;
only the slave role uses it to prepare the connection parameter
request PDU.

Change-id: I3f8519b23a83cb8c50c7fba81810eff7737ff74a
Signed-off-by: Vinayak Chettimada <vinayak.kariappa.chettimada@nordicsemi.no>
2016-11-15 19:11:21 -05:00
Vinayak Chettimada
3b22494192 Bluetooth: Controller: Fix observer filter_policy field size
Coverity analysis discovered that the observer filter policy
field was 1 bit wide, whereas the valid range for the extended
scanner filter policy feature implemented in the controller is 0 to 3.
Increase the bit field size from 1 to 2 bits.
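
A minimal sketch of the field-width change (struct and field names are
illustrative):

    #include <stdint.h>

    struct observer_sketch {
        uint8_t scan_type:1;
        uint8_t filter_policy:2;   /* was :1, so values 2 and 3 were lost */
        uint8_t rfu:5;
    };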

Change-Id: Id4b2e354961dfb3b45f72fa4e0ab18de7425bbb5
Signed-off-by: Vinayak Chettimada <vinayak.kariappa.chettimada@nordicsemi.no>
2016-11-15 19:11:21 -05:00
Vinayak Chettimada
c4a5b9c74e Bluetooth: Controller: Fix HCI Reset Command implementation
Added an implementation of the HCI Reset Command. The implementation
gracefully disables any running advertiser, observer, and/or
connection roles, and it resets controller context members.
The HCI Reset Command is implemented in such a way that
driver instances shared with other sub-systems and the
application are not disturbed, and instances/references used
by the Bluetooth Controller are gracefully returned.

Jira: ZEP-1282

Change-id: Ifb9ae6807736b5ec2d9f346cf2a590322056bcee
Signed-off-by: Vinayak Chettimada <vinayak.kariappa.chettimada@nordicsemi.no>
Signed-off-by: Carles Cufi <carles.cufi@nordicsemi.no>
2016-11-15 19:11:21 -05:00
Szymon Janc
ca23390e84 Bluetooth: Kconfig: Fix BR/EDR dependencies
BR/EDR code should have minimal impact on LE code, so to keep it simple
just require peripheral and central to be enabled when selecting BR/EDR
support.

Fix following Kconfig warning:

warning: (NETWORKING_WITH_BT && BLUETOOTH_BREDR) selects
    BLUETOOTH_L2CAP_DYNAMIC_CHANNEL which has unmet direct dependencies
    (BLUETOOTH && BLUETOOTH_HCI && BLUETOOTH_HCI_HOST && BLUETOOTH_CONN
    && BLUETOOTH_SMP)

Change-Id: I7f7cb8794def0df6daaa4abfe4596df460f1a2b2
Signed-off-by: Szymon Janc <ext.szymon.janc@tieto.com>
2016-11-15 13:28:21 -05:00
Gil Pitney
2644d370c8 cc3200: Remove CPU_HAS_FPU from cc3200 Kconfig.soc
Though Cortex-M4 could optionally have a floating point unit,
the MCU in the cc3200 in fact does not have an FPU.

Enabling CPU_HAS_FPU caused applications built with CONFIG_FLOAT=y
to crash during an early call to enable_floating_point().

This patch was validated by running microPython, which is one
such application.

Change-Id: I8bfd42c456524e152cbbb983001d9540d93fbe98
Signed-off-by: Gil Pitney <gil.pitney@linaro.org>
2016-11-15 16:43:32 +00:00
Benjamin Walsh
bef829cfc0 kernel: remove last instances of tNANO in comments
Change-Id: I3d533b819422d4b754afb81d3ea67c03bc7f5630
Signed-off-by: Benjamin Walsh <benjamin.walsh@windriver.com>
(cherry picked from commit 59a382e339)
2016-11-15 01:30:21 +00:00
Benjamin Walsh
c625aa2636 kernel/arm: add comment about _is_next_thread_current
Normally, _is_next_thread_current() must be called with interrupts
locked, but the ARM interrupt exit code does not have to do that. Add
explanation why.

Change-Id: Id383b47a055fdd6fbd5afffa52772e92febde98f
Signed-off-by: Benjamin Walsh <benjamin.walsh@windriver.com>
(cherry picked from commit dfa7ca4ee5)
2016-11-15 01:30:13 +00:00
Benjamin Walsh
4329e5e24a kernel: fix typo in comment
Change-Id: I1919fd7b0ae3cb3ac434acc2dceddf3afb4a975b
Signed-off-by: Benjamin Walsh <benjamin.walsh@windriver.com>
(cherry picked from commit ba26678fc6)
2016-11-15 01:30:06 +00:00
Inaky Perez-Gonzalez
4e4ac94f90 tests/drivers/adc: move to ztest to actually test
This TC only exercises the API, as we don't have a feedback loop
mechanism to verify whatever values are fed to the ADC.

Fixed the loop to complete after 10 runs; on each run, print the
values and actually report the difference between them. With no inputs
connected (aka: floating), they should be reporting noise relatively
close to the previous reading, so we might want to use this delta as a
testing pattern (assert if the delta is higher than some value, but
I've seen variations as high as 40M units). For now, the test is just
happy with being able to read them.

The buffer has been re-typed to uint32_t so we can iterate over it
without casting tricks -- it requires then only a single cast when
initializing sample.buffer (which shall be a void* anyway).

Duplicated the buffer, so we can flip/flop between two buffers to
compare against the entries read in the previous run.

v4: fixed missed warnings

Change-Id: If6b48b92231007202d74f5c042f6d0cf3fdcb60a
Signed-off-by: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
(cherry picked from commit 9e1df6f21f)
2016-11-15 00:47:57 +00:00
Szymon Janc
8d10dc63fc Bluetooth: Fix use of deprecated PRIMARY init level
Fix following warning:

  CC      subsys/bluetooth/host/monitor.o
In file included from zephyr/include/drivers/loapic.h:58:0,
                 from zephyr/include/drivers/ioapic.h:22,
                 from zephyr/include/drivers/sysapic.h:20,
                 from zephyr/include/arch/x86/irq_controller.h:33,
                 from zephyr/include/arch/x86/arch.h:28,
                 from zephyr/include/arch/cpu.h:23,
                 from zephyr/include/kernel.h:2458,
                 from zephyr/include/zephyr.h:20,
                 from zephyr/subsys/bluetooth/host/monitor.c:24:
zephyr/subsys/bluetooth/host/monitor.c: In function
    '_deprecation_check_sys_init_bt_monitor_init0':
zephyr/include/device.h:130:16: warning: '_INIT_LEVEL_PRIMARY' is
    deprecated [-Wdeprecated-declarations]
  static struct device_config _CONCAT(__config_, dev_name) __used \
                ^
zephyr/include/device.h:245:2: note: in expansion of macro
    'DEVICE_AND_API_INIT'
  DEVICE_AND_API_INIT(dev_name, drv_name, init_fn, data, cfg_info, \
  ^
zephyr/include/init.h:69:2: note: in expansion of macro 'DEVICE_INIT'
  DEVICE_INIT(_SYS_NAME(init_fn), "", init_fn, NULL, NULL, level, prio)
  ^
zephyr/subsys/bluetooth/host/monitor.c:193:1: note: in expansion of
    macro 'SYS_INIT'
 SYS_INIT(bt_monitor_init, PRIMARY, MONITOR_INIT_PRIORITY);
 ^
zephyr/include/device.h:48:31: note: declared here
 static __deprecated const int _INIT_LEVEL_PRIMARY = 1;

Change-Id: I0960bfddddfd1105daf3bb8cc1114e9a25840f2c
Signed-off-by: Szymon Janc <ext.szymon.janc@tieto.com>
2016-11-15 00:46:23 +00:00
Johan Hedberg
ce596d3c54 Bluetooth: doc: Fix reference to documentation location
The Bluetooth documentation is found in doc/subsystems/bluetooth and
not in doc/bluetooth.

Change-Id: I7e7010b5ae4a26ea552d75f1a095baec18d02630
Signed-off-by: Johan Hedberg <johan.hedberg@intel.com>
2016-11-14 10:49:53 +02:00
Anas Nashif
61b596b0e5 Zephyr 1.6.0-rc1
First release candidate of 1.6.

Change-Id: I01e8524c163f7e984405e2feecc0118db1605ab8
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
2016-11-13 10:16:28 -05:00
59014 changed files with 2333951 additions and 5932119 deletions


@@ -1,32 +1,22 @@
--mailback
--no-tree
--emacs
--summary-file
--show-types
--max-line-length=100
--max-line-length=80
--min-conf-desc-length=1
--typedefsfile=scripts/checkpatch/typedefsfile
--ignore BRACES
--ignore PRINTK_WITHOUT_KERN_LEVEL
--ignore SPLIT_STRING
--ignore VOLATILE
--ignore CONFIG_EXPERIMENTAL
--ignore PREFER_KERNEL_TYPES
--ignore PREFER_SECTION
--ignore AVOID_EXTERNS
--ignore NETWORKING_BLOCK_COMMENT_STYLE
--ignore DATE_TIME
--ignore MINMAX
--ignore CONST_STRUCT
--ignore FILE_PATH_CHANGES
--ignore SPDX_LICENSE_TAG
--ignore C99_COMMENT_TOLERANCE
--ignore REPEATED_WORD
--ignore UNDOCUMENTED_DT_STRING
--ignore DT_SPLIT_BINDING_PATCH
--ignore DT_SCHEMA_BINDING_PATCH
--ignore TRAILING_SEMICOLON
--ignore COMPLEX_MACRO
--ignore MULTISTATEMENT_MACRO_USE_DO_WHILE
--ignore ENOSYS
--ignore IS_ENABLED_CONFIG
--ignore EXPORT_SYMBOL
--ignore COMPARISON_TO_NULL
--exclude ext
--exclude net/ip/contiki


@@ -1,116 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
#
# Note: The list of ForEachMacros can be obtained using:
#
# git grep -h '^#define [^[:space:]]*FOR_EACH[^[:space:]]*(' include/ \
# | sed "s,^#define \([^[:space:]]*FOR_EACH[^[:space:]]*\)(.*$, - '\1'," \
# | sort | uniq
#
# References:
# - https://clang.llvm.org/docs/ClangFormatStyleOptions.html
---
BasedOnStyle: LLVM
AlignConsecutiveMacros: AcrossComments
AllowShortBlocksOnASingleLine: Never
AllowShortCaseLabelsOnASingleLine: false
AllowShortEnumsOnASingleLine: false
AllowShortFunctionsOnASingleLine: None
AllowShortIfStatementsOnASingleLine: false
AllowShortLoopsOnASingleLine: false
AttributeMacros:
- __aligned
- __deprecated
- __packed
- __printf_like
- __syscall
- __syscall_always_inline
- __subsystem
BitFieldColonSpacing: After
BreakBeforeBraces: Linux
ColumnLimit: 100
ConstructorInitializerIndentWidth: 8
ContinuationIndentWidth: 8
ForEachMacros:
- 'ARRAY_FOR_EACH'
- 'ARRAY_FOR_EACH_PTR'
- 'FOR_EACH'
- 'FOR_EACH_FIXED_ARG'
- 'FOR_EACH_IDX'
- 'FOR_EACH_IDX_FIXED_ARG'
- 'FOR_EACH_NONEMPTY_TERM'
- 'FOR_EACH_FIXED_ARG_NONEMPTY_TERM'
- 'RB_FOR_EACH'
- 'RB_FOR_EACH_CONTAINER'
- 'SYS_DLIST_FOR_EACH_CONTAINER'
- 'SYS_DLIST_FOR_EACH_CONTAINER_SAFE'
- 'SYS_DLIST_FOR_EACH_NODE'
- 'SYS_DLIST_FOR_EACH_NODE_SAFE'
- 'SYS_SEM_LOCK'
- 'SYS_SFLIST_FOR_EACH_CONTAINER'
- 'SYS_SFLIST_FOR_EACH_CONTAINER_SAFE'
- 'SYS_SFLIST_FOR_EACH_NODE'
- 'SYS_SFLIST_FOR_EACH_NODE_SAFE'
- 'SYS_SLIST_FOR_EACH_CONTAINER'
- 'SYS_SLIST_FOR_EACH_CONTAINER_SAFE'
- 'SYS_SLIST_FOR_EACH_NODE'
- 'SYS_SLIST_FOR_EACH_NODE_SAFE'
- '_WAIT_Q_FOR_EACH'
- 'Z_FOR_EACH'
- 'Z_FOR_EACH_ENGINE'
- 'Z_FOR_EACH_EXEC'
- 'Z_FOR_EACH_FIXED_ARG'
- 'Z_FOR_EACH_FIXED_ARG_EXEC'
- 'Z_FOR_EACH_IDX'
- 'Z_FOR_EACH_IDX_EXEC'
- 'Z_FOR_EACH_IDX_FIXED_ARG'
- 'Z_FOR_EACH_IDX_FIXED_ARG_EXEC'
- 'Z_GENLIST_FOR_EACH_CONTAINER'
- 'Z_GENLIST_FOR_EACH_CONTAINER_SAFE'
- 'Z_GENLIST_FOR_EACH_NODE'
- 'Z_GENLIST_FOR_EACH_NODE_SAFE'
- 'STRUCT_SECTION_FOREACH'
- 'STRUCT_SECTION_FOREACH_ALTERNATE'
- 'TYPE_SECTION_FOREACH'
- 'K_SPINLOCK'
- 'COAP_RESOURCE_FOREACH'
- 'COAP_SERVICE_FOREACH'
- 'COAP_SERVICE_FOREACH_RESOURCE'
- 'HTTP_RESOURCE_FOREACH'
- 'HTTP_SERVER_CONTENT_TYPE_FOREACH'
- 'HTTP_SERVICE_FOREACH'
- 'HTTP_SERVICE_FOREACH_RESOURCE'
- 'I3C_BUS_FOR_EACH_I3CDEV'
- 'I3C_BUS_FOR_EACH_I2CDEV'
- 'MIN_HEAP_FOREACH'
IfMacros:
- 'CHECKIF'
# Disabled for now, see bug https://github.com/zephyrproject-rtos/zephyr/issues/48520
#IncludeBlocks: Regroup
IncludeCategories:
- Regex: '^".*\.h"$'
Priority: 0
- Regex: '^<(assert|complex|ctype|errno|fenv|float|inttypes|limits|locale|math|setjmp|signal|stdarg|stdbool|stddef|stdint|stdio|stdlib|string|tgmath|time|wchar|wctype)\.h>$'
Priority: 1
- Regex: '^\<zephyr/.*\.h\>$'
Priority: 2
- Regex: '.*'
Priority: 3
IndentCaseLabels: false
IndentGotoLabels: false
IndentWidth: 8
InsertBraces: true
InsertNewlineAtEOF: true
SpaceBeforeInheritanceColon: False
SpaceBeforeParens: ControlStatementsExceptControlMacros
SortIncludes: Never
UseTab: ForContinuationAndIndentation
WhitespaceSensitiveMacros:
- COND_CODE_0
- COND_CODE_1
- IF_DISABLED
- IF_ENABLED
- LISTIFY
- STRINGIFY
- Z_STRINGIFY
- DT_FOREACH_PROP_ELEM_SEP


@@ -1,21 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
#
# Copyright (c) 2024, Basalte bv
analyzer:
# Start by disabling all
- --disable-all
# Enable the sensitive profile
- --enable=sensitive
# Disable unused cases
- --disable=boost
- --disable=mpi
# Many identifiers in zephyr start with _
- --disable=clang-diagnostic-reserved-identifier
- --disable=clang-diagnostic-reserved-macro-identifier
# Cleanup
- --clean


@@ -1,31 +0,0 @@
codecov:
  notify:
    require_ci_to_pass: yes

coverage:
  precision: 2
  round: down
  range: "70...100"

  status:
    project: yes
    patch: yes
    changes: no

# ignore:
#  - "tests/**/*"
#  - "samples/**/*"
#  - "ext/hal/**/*"

parsers:
  gcov:
    branch_detection:
      conditional: yes
      loop: yes
      method: no
      macro: no

comment:
  layout: "reach, diff, flags, files, footer"
  behavior: default
  require_changes: no


@@ -1,96 +0,0 @@
# EditorConfig: https://editorconfig.org/
# top-most EditorConfig file
root = true
# All (Defaults)
[*]
charset = utf-8
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
max_line_length = 100
# Assembly
[*.S]
indent_style = tab
indent_size = 8
# C
[*.{c,h}]
indent_style = tab
indent_size = 8
# C++
[*.{cpp,hpp}]
indent_style = tab
indent_size = 8
# Linker Script
[*.ld]
indent_style = tab
indent_size = 8
# Python
[*.py]
indent_style = space
indent_size = 4
# Perl
[*.pl]
indent_style = tab
indent_size = 8
# reStructuredText
[*.rst]
indent_style = space
indent_size = 3
# YAML
[*.{yml,yaml}]
indent_style = space
indent_size = 2
# Shell Script
[*.sh]
indent_style = space
indent_size = 4
# Windows Command Script
[*.cmd]
end_of_line = crlf
indent_style = tab
indent_size = 8
# Valgrind Suppression File
[*.supp]
indent_style = space
indent_size = 3
# CMake
[{CMakeLists.txt,*.cmake}]
indent_style = space
indent_size = 2
# Makefile
[Makefile]
indent_style = tab
indent_size = 8
# Device tree
[*.{dts,dtsi,overlay}]
indent_style = tab
indent_size = 8
# Git commit messages
[COMMIT_EDITMSG]
max_line_length = 75
# Patches
[{*.patch,*.diff}]
trim_trailing_whitespace = false
# Kconfig
[Kconfig*]
indent_style = tab
indent_size = 8

.gitattributes (vendored): 8 changes

@@ -3,11 +3,3 @@
.gitattributes export-ignore
.gitignore export-ignore
.mailmap export-ignore
# Tell git to not diff certain files
*.svg -diff
# Tell linguist that generated test pattern files should not be included in the
# language statistics.
*.pat linguist-generated
*.svg linguist-generated


@@ -1,85 +0,0 @@
name: Bug Report
description: File a bug report.
labels: ["bug"]
type: "Bug"
assignees: []
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this bug report!
- type: textarea
id: what-happened
attributes:
label: Describe the bug
description: |
A clear and concise description of what the bug is.
placeholder: |
Please also mention any information which could help others to understand
the problem you're facing:
- What target platform are you using?
- What have you tried to diagnose or workaround this issue?
- Is this a regression? If yes, have you been able to "git bisect" it to a
specific commit?
validations:
required: true
- type: checkboxes
id: regression
attributes:
label: Regression
description: |
Check this box if this is a regression and provide a SHA if you were able to "git bisect" to a specific commit.
options:
- label: This is a regression.
required: false
- type: textarea
id: reproduce
attributes:
label: Steps to reproduce
description: |
Steps to reproduce the behavior.
placeholder: |
Steps to reproduce the behavior:
1. mkdir build; cd build
2. cmake -DBOARD=board_xyz
3. make
4. See error
validations:
required: false
- type: textarea
id: logs
attributes:
label: Relevant log output
description: Please copy and paste any relevant log output. This will be automatically formatted into code, so no need for backticks.
render: shell
- type: dropdown
attributes:
label: Impact
description: Impact of this bug
multiple: false
options:
- Showstopper: Prevents release or major functionality; system unusable.
- Major: Severely degrades functionality; workaround is difficult or unavailable.
- Functional Limitation: Some features not working as expected, but system usable.
- Annoyance: Minor irritation; no significant impact on usability or functionality.
- Intermittent: Occurs occasionally; hard to reproduce.
- Not sure
default: 3
validations:
required: true
- type: textarea
id: env
attributes:
label: Environment
description: please complete the following information
placeholder: |
- OS: (e.g. Linux, MacOS, Windows)
- Toolchain (e.g Zephyr SDK, ...)
- Commit SHA or Version used
- type: textarea
id: context
attributes:
label: Additional Context
description: Provide other context that could be relevant to the bug, such as pin setting, target configuration, etc.


@@ -1,42 +0,0 @@
name: Enhancement
description: Submit an Enhancement
labels: ["Enhancement"]
type: "Enhancement"
assignees: []
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this enhancement proposal.
- type: textarea
id: description
attributes:
label: Summary
description: |
Is your enhancement proposal related to a problem? Please describe.
placeholder: |
A clear and concise description of what the problem is.
validations:
required: true
- type: textarea
id: solution
attributes:
label: Describe the solution you'd like
description: |
Describe the solution you'd like
placeholder: |
A clear and concise description of what you want to happen.
validations:
required: true
- type: textarea
id: alternatives
attributes:
label: Alternatives
description: Describe alternatives you've considered
placeholder: |
A clear and concise description of any alternative solutions or features you've considered.
- type: textarea
id: context
attributes:
label: Additional Context
description: Add any other context or graphics (drag-and-drop an image) about the enhancement here.


@@ -1,75 +0,0 @@
name: RFC / Proposal
description: Submit a Proposal (RFC)
labels: ["RFC"]
type: RFC
assignees: []
body:
- type: markdown
attributes:
value: |
## Introduction
This section targets end users, TSC members, maintainers and anyone else
that might need a quick explanation of your proposed change.
- type: textarea
id: problem-description
attributes:
label: Problem Description
description: Why do we want this change and what problem are we trying to address?
placeholder: Explain the problem or limitation this RFC is meant to resolve.
validations:
required: true
- type: textarea
id: proposed-change-summary
attributes:
label: Proposed Change (Summary)
description: A high-level summary of the proposed change.
placeholder: Brief summary of what will change if this RFC is implemented.
validations:
required: true
- type: markdown
attributes:
value: |
## Detailed RFC
This section targets the development team. Upon reading it, each engineer
should understand what must be done to implement the proposed feature.
- type: textarea
id: detailed-change
attributes:
label: Proposed Change (Detailed)
description: Describe the change in as much detail as possible. Include context or background information, and note any reuse of existing components where applicable.
placeholder: Explain exactly what you're planning to change and how.
validations:
required: true
- type: textarea
id: dependencies
attributes:
label: Dependencies
description: Highlight how this change may affect the rest of the project or other teams/components.
placeholder: List components, modules, or teams affected.
validations:
required: false
- type: textarea
id: concerns
attributes:
label: Concerns and Unresolved Questions
description: List any concerns, unknowns, or unresolved questions related to this proposal.
placeholder: Any areas of uncertainty?
validations:
required: false
- type: textarea
id: alternatives
attributes:
label: Alternatives Considered
description: What alternative solutions were considered? Why was this proposal chosen?
placeholder: List alternatives and explain the rationale behind your choice.
validations:
required: false

View File

@@ -1,29 +0,0 @@
name: Feature Request
description: Suggest a new feature or enhancement
labels: ["Feature Request"]
type: Feature
assignees: []
body:
- type: textarea
id: problem
attributes:
label: Is your feature request related to a problem? Please describe.
description: A clear and concise description of what the problem is.
placeholder: e.g., I'm frustrated when I need to do X manually because Y is missing.
validations:
required: true
- type: textarea
id: solution
attributes:
label: Describe the solution you'd like
description: A clear and concise description of what you want to happen.
placeholder: e.g., It would be great if the system could automatically handle X by doing Y.
validations:
required: true
- type: textarea
id: alternatives
attributes:
label: Describe alternatives you've considered
description: Include any alternative solutions or features

View File

@@ -1,57 +0,0 @@
name: Contributor Nomination
description: Nominate a GitHub user for the Contributor role with triage permissions
labels: [Role Nomination]
assignees: ['nashif']
body:
- type: markdown
attributes:
value: |
## Background
The [TSC Project Roles](https://docs.zephyrproject.org/latest/project/project_roles.html) defines the main roles for the Zephyr Project, including Maintainer, Collaborator, and Contributor.
By default, anyone who contributes code or documentation is a Contributor, but with the lowest [GitHub Permission Level](https://docs.github.com/en/organizations/managing-access-to-your-organizations-repositories/repository-roles-for-an-organization) of **Read**.
Use this form to nominate a user for the **Contributor** role with **Triage** permission, which allows the user to:
- Add reviewers to pull requests
- Be added as a reviewer by others
- type: input
id: full-name
attributes:
label: Full Name
description: Full name of the nominated contributor.
placeholder: e.g., Jane Doe
validations:
required: true
- type: input
id: github-username
attributes:
label: GitHub Username
description: GitHub handle of the nominated contributor.
placeholder: e.g., @janedoe
validations:
required: true
- type: input
id: organization
attributes:
label: Organization
description: Organization the nominee is affiliated with (optional).
placeholder: e.g., Acme Corp
validations:
required: false
- type: textarea
id: supporting-documents
attributes:
label: Supporting Documents
description: Provide links to 3–5 pull requests authored or reviewed by the nominee that demonstrate their dedication to the Zephyr project.
placeholder: |
e.g.,
- https://github.com/zephyrproject-rtos/zephyr/pull/12345
- https://github.com/zephyrproject-rtos/zephyr/pull/23456
- https://github.com/zephyrproject-rtos/zephyr/pull/34567
validations:
required: true

View File

@@ -1,106 +0,0 @@
name: External Component Integration
description: Propose integration of an external open source component
labels: ["TSC"]
assignees: []
body:
- type: textarea
id: origin
attributes:
label: Origin
description: Name of project hosting the original open source code. Provide a link to the source.
placeholder: e.g., SQLite - https://sqlite.org
validations:
required: true
- type: textarea
id: purpose
attributes:
label: Purpose
description: Brief description of what this software does.
placeholder: |
e.g., A small, fast, self-contained SQL database engine.
validations:
required: true
- type: textarea
id: integration-mode
attributes:
label: Mode of Integration
description: Should this be integrated in the main tree or as a module? Explain your choice and suggest a module repo name if applicable.
placeholder: |
e.g., As a module - proposed repo name: zephyr-sqlite
validations:
required: true
- type: textarea
id: maintainership
attributes:
label: Maintainership
description: List maintainers (GitHub IDs) for this integration. Include at least one primary maintainer.
placeholder: |
e.g., @username1 (primary), @username2 (collaborator)
validations:
required: true
- type: input
id: pull-request
attributes:
label: Pull Request
description: Link to the pull request (if any) for this integration. Must be labeled "DNM" (Do Not Merge).
placeholder: |
e.g., https://github.com/zephyrproject-rtos/zephyr/pull/12345
validations:
required: false
- type: textarea
id: description
attributes:
label: Description
description: Long-form description to justify suitability of this component.
placeholder: |
- What is its primary functionality?
- What problem does it solve?
- Why is this the right component?
validations:
required: true
- type: textarea
id: security
attributes:
label: Security
description: Security-related aspects of this component, including cryptographic functions and known vulnerabilities.
placeholder: |
- Does it use cryptography?
- How are vulnerabilities handled?
- Any known CVEs?
validations:
required: false
- type: textarea
id: dependencies
attributes:
label: Dependencies
description: What does this component depend on, and how will it be integrated (directly or via abstraction)?
placeholder: |
- Other external packages?
- Direct or abstracted use in Zephyr?
validations:
required: false
- type: input
id: revision
attributes:
label: Version or SHA
description: Which version or specific commit should be initially integrated?
placeholder: e.g., v3.45.0 or 79cc94d
validations:
required: true
- type: input
id: license
attributes:
label: License (SPDX)
description: Provide the license using a valid SPDX identifier (e.g., BSD-3-Clause).
placeholder: e.g., MIT or BSD-3-Clause
validations:
required: true

View File

@@ -1,5 +0,0 @@
blank_issues_enabled: false
contact_links:
- name: Zephyr Community Support
url: https://github.com/zephyrproject-rtos/zephyr/discussions
about: Please ask and answer questions here.

.github/SECURITY.md
View File

@@ -1,22 +0,0 @@
# Security Policy
## Supported versions
The Zephyr project supports the following versions with security
updates:
- The most recent release, and the release prior to that.
- Active LTS releases.
At this time, with the latest release of v4.3, the supported
versions are:
- v4.3: Current release
- v4.2: Prior release
- v3.7: Current LTS
## Reporting process
Please see our [Security Vulnerability
Reporting](https://docs.zephyrproject.org/latest/security/reporting.html)
page for details on the process.

View File

@@ -1,2 +0,0 @@
paths:
- .github

View File

@@ -1,2 +0,0 @@
paths:
- doc

View File

@@ -1,26 +0,0 @@
version: 2
enable-beta-ecosystems: true
updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
commit-message:
prefix: "ci: github: "
labels: []
groups:
actions-deps:
patterns:
- "*"
- package-ecosystem: "uv"
directory: "/doc"
schedule:
interval: "weekly"
commit-message:
prefix: "ci: doc: "
labels: []
groups:
doc-deps:
patterns:
- "*"

View File

@@ -1,16 +0,0 @@
license:
main: apache-2.0
report_missing: true
category: Permissive
copyright:
check: true
exclude:
extensions:
- yml
- yaml
- html
- rst
- conf
- cfg
langs:
- HTML

View File

@@ -1,88 +0,0 @@
name: Pull Request/Issue Assigner
on:
pull_request_target:
types:
- opened
- synchronize
- reopened
- ready_for_review
branches:
- main
- collab-*
- v*-branch
issues:
types:
- labeled
permissions:
contents: read
jobs:
assignment:
name: Pull Request/Issue Assignment
if: github.event.pull_request.draft == false
runs-on: ubuntu-24.04
permissions:
issues: write # to add assignees to issues
steps:
- name: Check out source code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: Set up Python
uses: zephyrproject-rtos/action-python-env@32e53bef090c33d53aa94f1d9a9d29c93cfdc5f7 # main
with:
python-version: 3.12
- name: Fetch west.yml/MAINTAINERS.yml from pull request
if: >
github.event_name == 'pull_request_target' && github.base_ref == 'main'
run: |
git fetch origin pull/${{ github.event.pull_request.number }}/merge
git show FETCH_HEAD:west.yml > pr_west.yml
git show FETCH_HEAD:MAINTAINERS.yml > pr_MAINTAINERS.yml
- name: west setup
if: >
github.event_name == 'pull_request_target'
run: |
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
west init -l . || true
- name: Run assignment script
env:
GITHUB_TOKEN: ${{ secrets.ZB_PR_ASSIGNER_GITHUB_TOKEN }}
run: |
FLAGS="-v"
FLAGS+=" -o ${{ github.event.repository.owner.login }}"
FLAGS+=" -r ${{ github.event.repository.name }}"
FLAGS+=" -M MAINTAINERS.yml"
if [ "${{ github.event_name }}" = "pull_request_target" ]; then
if [ "${{ github.base_ref }}" != "main" ]; then
FLAGS+=" -P ${{ github.event.pull_request.number }} --updated-manifest pr_west.yml --updated-maintainer-file pr_MAINTAINERS.yml"
else
FLAGS+=" -P ${{ github.event.pull_request.number }}"
fi
elif [ "${{ github.event_name }}" = "issues" ]; then
FLAGS+=" -I ${{ github.event.issue.number }}"
elif [ "${{ github.event_name }}" = "schedule" ]; then
FLAGS+=" --modules"
else
echo "Unknown event: ${{ github.event_name }}"
exit 1
fi
python3 scripts/ci/set_assignees.py $FLAGS
- name: Check maintainer file changes
if: >
github.event_name == 'pull_request_target' && github.base_ref == 'main'
env:
GITHUB_TOKEN: ${{ secrets.ZB_PR_ASSIGNER_GITHUB_TOKEN }}
run: |
python ./scripts/ci/check_maintainer_changes.py \
--repo zephyrproject-rtos/zephyr MAINTAINERS.yml pr_MAINTAINERS.yml

View File

@@ -1,38 +0,0 @@
name: Backport
on:
pull_request_target:
types:
- closed
- labeled
branches:
- main
permissions:
contents: read
jobs:
backport:
name: Backport
runs-on: ubuntu-24.04
permissions:
contents: write # to create/push backport branches
pull-requests: write # to create backport PRs
issues: write # to add labels to issue created if backport fails
# Only react to merged PRs for security reasons.
# See https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#pull_request_target.
if: >
github.event.pull_request.merged &&
(
github.event.action == 'closed' ||
(
github.event.action == 'labeled' &&
contains(github.event.label.name, 'backport')
)
)
steps:
- name: Backport
uses: zephyrproject-rtos/action-backport@7e74f601d11eaca577742445e87775b5651a965f # v2.0.3-3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
issue_labels: Backport
labels_template: '["Backport"]'

View File

@@ -1,50 +0,0 @@
name: Backport Issue Check
on:
pull_request_target:
types:
- edited
- opened
- reopened
- synchronize
branches:
- v*-branch
permissions:
contents: read
jobs:
backport:
name: Backport Issue Check
concurrency:
group: backport-issue-check-${{ github.ref }}
cancel-in-progress: true
runs-on: ubuntu-24.04
if: github.repository == 'zephyrproject-rtos/zephyr'
permissions:
issues: read # to check if associated issue exists for backport
steps:
- name: Check out source code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run backport issue checker
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
./scripts/release/list_backports.py \
-o ${{ github.event.repository.owner.login }} \
-r ${{ github.event.repository.name }} \
-b ${{ github.event.pull_request.base.ref }} \
-p ${{ github.event.pull_request.number }}

View File

@@ -1,34 +0,0 @@
name: Publish BabbleSim Tests Results
on:
workflow_run:
workflows: ["BabbleSim Tests"]
types:
- completed
permissions:
contents: read
jobs:
bsim-test-results:
name: "Publish BabbleSim Test Results"
runs-on: ubuntu-24.04
if: github.event.workflow_run.conclusion != 'skipped'
permissions:
checks: write # to create the check run entry with test results
steps:
- name: Download artifacts
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
with:
run_id: ${{ github.event.workflow_run.id }}
- name: Publish BabbleSim Test Results
uses: EnricoMi/publish-unit-test-result-action@34d7c956a59aed1bfebf31df77b8de55db9bbaaf # v2.21.0
with:
check_name: BabbleSim Test Results
comment_mode: off
commit: ${{ github.event.workflow_run.head_sha }}
event_file: event/event.json
event_name: ${{ github.event.workflow_run.event }}
files: "bsim-test-results/**/bsim_results.xml"

View File

@@ -1,212 +0,0 @@
name: BabbleSim Tests
on:
pull_request:
paths:
- ".github/workflows/bsim-tests.yaml"
- ".github/workflows/bsim-tests-publish.yaml"
- "west.yml"
- "subsys/bluetooth/**"
- "tests/bsim/**"
- "boards/nordic/nrf5*/*dt*"
- "dts/*/nordic/**"
- "tests/bluetooth/**"
- "samples/bluetooth/**"
- "boards/native/**"
- "soc/native/**"
- "arch/posix/**"
- "include/zephyr/arch/posix/**"
- "scripts/native_simulator/**"
- "samples/net/sockets/echo_*/**"
- "modules/hal_nordic/**"
- "modules/mbedtls/**"
- "modules/openthread/**"
- "subsys/net/l2/openthread/**"
- "include/zephyr/net/openthread.h"
- "drivers/ieee802154/**"
- "include/zephyr/net/ieee802154*"
- "drivers/serial/*nrfx*"
- "tests/drivers/uart/**"
- '!**.rst'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
bsim-test:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
env:
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
EDTT_PATH: ../tools/edtt
permissions:
checks: write # to create the check run entry with test results
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Environment Setup
env:
BASE_REF: ${{ github.base_ref }}
run: |
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
west init -l . || true
west config manifest.group-filter -- +ci
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Check common triggering files
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
id: check-common-files
with:
files: |
.github/workflows/bsim-tests.yaml
.github/workflows/bsim-tests-publish.yaml
west.yml
boards/native/
soc/native/
arch/posix/
include/zephyr/arch/posix/
scripts/native_simulator/
tests/bsim/*
boards/nordic/nrf5*/*dt*
dts/*/nordic/
modules/mbedtls/**
modules/hal_nordic/**
- name: Check if Bluetooth files changed
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
id: check-bluetooth-files
with:
files: |
samples/bluetooth/
subsys/bluetooth/
tests/bluetooth/
tests/bsim/bluetooth/
- name: Check if Networking files changed
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
id: check-networking-files
with:
files: |
tests/bsim/net/
samples/net/sockets/echo_*/
modules/openthread/
subsys/net/l2/openthread/
include/zephyr/net/openthread.h
drivers/ieee802154/
include/zephyr/net/ieee802154*
- name: Check if UART files changed
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
id: check-uart-files
with:
files: |
tests/bsim/drivers/uart/
drivers/serial/*nrfx*
tests/drivers/uart/
- name: Update BabbleSim to manifest revision
if: >
steps.check-bluetooth-files.outputs.any_modified == 'true'
|| steps.check-networking-files.outputs.any_modified == 'true'
|| steps.check-uart-files.outputs.any_modified == 'true'
|| steps.check-common-files.outputs.any_modified == 'true'
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git -c advice.detachedHead=false checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- name: Run Bluetooth Tests with BSIM
if: steps.check-bluetooth-files.outputs.any_modified == 'true' || steps.check-common-files.outputs.any_modified == 'true'
run: |
tests/bsim/ci.bt.sh
- name: Run Networking Tests with BSIM
if: steps.check-networking-files.outputs.any_modified == 'true' || steps.check-common-files.outputs.any_modified == 'true'
run: |
tests/bsim/ci.net.sh
- name: Run UART Tests with BSIM
if: steps.check-uart-files.outputs.any_modified == 'true' || steps.check-common-files.outputs.any_modified == 'true'
run: |
tests/bsim/ci.uart.sh
- name: Merge Test Results
run: |
junitparser merge --glob "./bsim_*/*bsim_results.*.xml" "./twister-out/twister.xml" junit.xml
junit2html junit.xml junit.html
- name: Upload Unit Test Results in HTML
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: HTML Unit Test Results
if-no-files-found: ignore
path: |
junit.html
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@34d7c956a59aed1bfebf31df77b8de55db9bbaaf # v2.21.0
with:
check_name: Bsim Test Results
files: "junit.xml"
comment_mode: off
- name: Upload Event Details
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: event
path: |
${{ github.event_path }}

View File

@@ -1,76 +0,0 @@
# Copyright (c) 2021, 2022 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
# Make a snapshot of open bugs as a python pickle file, compressed
# using xz. Upload the xz file to Amazon S3.
name: Bug Snapshot
on:
workflow_dispatch:
schedule:
# Run daily at 00:05
- cron: '5 00 * * *'
permissions:
contents: read
jobs:
make_bugs_pickle:
name: Make bugs pickle
runs-on: ubuntu-24.04
if: github.repository_owner == 'zephyrproject-rtos' && github.ref == 'refs/heads/main'
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Snapshot bugs
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
BUGS_PICKLE_FILENAME="zephyr-bugs-$(date -I).pickle.xz"
BUGS_PICKLE_PATH="${RUNNER_TEMP}/${BUGS_PICKLE_FILENAME}"
set -euo pipefail
python3 scripts/make_bugs_pickle.py | xz > ${BUGS_PICKLE_PATH}
echo "BUGS_PICKLE_FILENAME=${BUGS_PICKLE_FILENAME}" >> ${GITHUB_ENV}
echo "BUGS_PICKLE_PATH=${BUGS_PICKLE_PATH}" >> ${GITHUB_ENV}
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@00943011d9042930efac3dcd3a170e4273319bc8 # v5.1.0
with:
aws-access-key-id: ${{ vars.AWS_BUILDS_ZEPHYR_BUG_SNAPSHOT_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_BUILDS_ZEPHYR_BUG_SNAPSHOT_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
run: |
REPOSITORY_NAME="${GITHUB_REPOSITORY#*/}"
PUBLISH_BUCKET="builds.zephyrproject.org"
PUBLISH_DOMAIN="builds.zephyrproject.io"
if [ "${{github.event_name}}" = "schedule" ]; then
PUBLISH_ROOT="${REPOSITORY_NAME}/bug-snapshot/daily"
else
PUBLISH_ROOT="${REPOSITORY_NAME}/bug-snapshot"
fi
PUBLISH_UPLOAD_URI="s3://${PUBLISH_BUCKET}/${PUBLISH_ROOT}/${BUGS_PICKLE_FILENAME}"
PUBLISH_DOWNLOAD_URI="https://${PUBLISH_DOMAIN}/${PUBLISH_ROOT}/${BUGS_PICKLE_FILENAME}"
aws s3 cp --quiet ${BUGS_PICKLE_PATH} ${PUBLISH_UPLOAD_URI}
echo "Bug pickle is available at: ${PUBLISH_DOWNLOAD_URI}" >> ${GITHUB_STEP_SUMMARY}

View File

@@ -1,184 +0,0 @@
name: Build with Clang/LLVM
on:
push:
branches:
- main
- v*-branch
- collab-*
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
clang-build:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
subset: [1, 2]
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
CCACHE_IGNOREOPTIONS: '-specs=* --specs=*'
LLVM_TOOLCHAIN_PATH: /usr/lib/llvm-20
BASE_REF: ${{ github.base_ref }}
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: Environment Setup
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git clean -f -d
git log --pretty=oneline | head -n 10
west init -l . || true
west config --global update.narrow true
west config manifest.group-filter -- +ci,+optional
# In some cases modules are left in a state where they can't be
# updated (i.e. when we cancel a job and the builder is killed),
# so first retry the update; if that does not work, remove all modules
# and start over. (Workaround until we implement more robust module
# west caching).
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west2.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Check Environment
run: |
cmake --version
${LLVM_TOOLCHAIN_PATH}/bin/clang --version
gcc --version
ls -la
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- name: Update BabbleSim to manifest revision
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git -c advice.detachedHead=false checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- name: Run Tests with Twister
id: twister
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=llvm
./scripts/twister -p native_sim --force-color --inline-logs -M -N -v --retry-failed 2 \
-T tests --subset ${{matrix.subset}}/2 -j 16
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Unit Test Results (Subset ${{ matrix.subset }})
path: |
twister-out/twister.xml
twister-out/twister.json
if-no-files-found: ignore
clang-build-results:
name: "Publish Unit Tests Results"
needs: clang-build
runs-on: ubuntu-24.04
permissions:
checks: write # to create GitHub annotations
if: (success() || failure())
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: Download Artifacts
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
path: artifacts
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Merge Test Results
run: |
junitparser merge --glob 'artifacts/*/twister.xml' 'artifacts/*/*/twister.xml' junit.xml
junit2html junit.xml junit-clang.html
- name: Upload Unit Test Results in HTML
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: HTML Unit Test Results
if-no-files-found: ignore
path: |
junit-clang.html
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@34d7c956a59aed1bfebf31df77b8de55db9bbaaf # v2.21.0
if: always()
with:
check_name: Unit Test Results
files: "**/twister.xml"
comment_mode: off

View File

@@ -1,275 +0,0 @@
name: Code Coverage with codecov
on:
push:
branches:
- main
- v*-branch
- collab-*
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
codecov:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
platform: ["mps2/an385", "native_sim", "qemu_x86", "unit_testing"]
include:
- platform: 'mps2/an385'
normalized: 'mps2_an385'
- platform: 'native_sim'
normalized: 'native_sim'
- platform: 'qemu_x86'
normalized: 'qemu_x86'
- platform: 'unit_testing'
normalized: 'unit_testing'
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
# `--specs` is ignored because ccache is unable to resolve the toolchain specs file path.
CCACHE_IGNOREOPTIONS: '-specs=* --specs=*'
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: west setup
run: |
west init -l . || true
west update 1> west.update.log || west update 1> west.update-2.log
- name: Environment Setup
run: |
cmake --version
gcc --version
ls -la
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- name: Run Tests with Twister (Push)
continue-on-error: true
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
mkdir -p coverage/reports
./scripts/twister --save-tests ${{matrix.normalized}}-testplan.json
ls -la
./scripts/twister \
-i --force-color -N -v --filter runnable -p ${{ matrix.platform }} --coverage \
-T tests --coverage-tool gcovr -xCONFIG_TEST_EXTRA_STACK_SIZE=4096 -e nano \
--timeout-multiplier 2
- name: Build Doxygen Coverage
if: matrix.platform == 'unit_testing'
run: |
pip install -r doc/requirements.txt --require-hashes
sudo apt-get update
sudo apt-get install -y graphviz # dot is needed but currently missing from the Docker image
cmake -B doc/_build -S doc
cmake --build doc/_build --target doxygen-coverage
- name: Upload Doxygen Coverage Results
if: matrix.platform == 'unit_testing'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: doxygen-coverage-results
path: |
doc/_build/new.info
doc/_build/coverage-report
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Rename coverage files
if: always()
run: |
mv twister-out/coverage.json coverage/reports/${{matrix.normalized}}.json
- name: Upload Coverage Results
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Coverage Data (Subset ${{ matrix.normalized }})
path: |
coverage/reports/${{ matrix.normalized }}.json
${{ matrix.normalized }}-testplan.json
codecov-results:
name: "Publish Coverage Results"
needs: codecov
runs-on: ubuntu-24.04
# the codecov job might be skipped, we don't need to run this job then
if: success() || failure()
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Download Artifacts
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
path: coverage/reports
- name: Move coverage files
run: |
ls -lRt ./coverage/reports
mv ./coverage/reports/*/*testplan.json .
mv ./coverage/reports/*/coverage/reports/*.json ./coverage/reports
ls -la ./coverage/reports
- name: Generate list of coverage files
id: get-coverage-files
shell: cmake -P {0}
run: |
file(GLOB INPUT_FILES_LIST "coverage/reports/*.json")
set(MERGELIST "")
set(FILELIST "")
foreach(ITEM ${INPUT_FILES_LIST})
get_filename_component(f ${ITEM} NAME)
if(FILELIST STREQUAL "")
set(FILELIST "${f}")
else()
set(FILELIST "${FILELIST},${f}")
endif()
endforeach()
foreach(ITEM ${INPUT_FILES_LIST})
get_filename_component(f ${ITEM} NAME)
if(MERGELIST STREQUAL "")
set(MERGELIST "--add-tracefile ${f}")
else()
set(MERGELIST "${MERGELIST} -a ${f}")
endif()
endforeach()
file(APPEND $ENV{GITHUB_OUTPUT} "mergefiles=${MERGELIST}\n")
file(APPEND $ENV{GITHUB_OUTPUT} "covfiles=${FILELIST}\n")
- name: Merge coverage files
run: |
pushd ./coverage/reports
gcovr ${{ steps.get-coverage-files.outputs.mergefiles }} --merge-mode-functions=separate --json merged.json
gcovr ${{ steps.get-coverage-files.outputs.mergefiles }} --merge-mode-functions=separate --cobertura merged.xml
popd
- name: Get current date
id: run_date
run: |
echo "run_date=$(date --iso-8601=minutes)" >> "$GITHUB_OUTPUT"
echo "run_date_short=$(date +'%Y-%m-%d')" >> "$GITHUB_OUTPUT"
echo "run_date_year=$(date +'%Y')" >> "$GITHUB_OUTPUT"
echo "run_date_month=$(date +'%m')" >> "$GITHUB_OUTPUT"
- name: Generate Coverage Report
if: always()
run: |
python3 ./scripts/ci/coverage/coverage_analysis.py \
-t native_sim-testplan.json \
-m MAINTAINERS.yml \
-c coverage/reports/merged.json \
-o coverage-report-${{ steps.run_date.outputs.run_date_short }} \
-f all
cp coverage-report-* coverage/reports/
- name: Upload Merged Coverage Results and Report
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Coverage Data and report
path: |
coverage/reports/merged.json
coverage/reports/merged.xml
coverage/reports/coverage-report-${{ steps.run_date.outputs.run_date_short }}.json
coverage/reports/coverage-report-${{ steps.run_date.outputs.run_date_short }}.xlsx
- name: Upload test coverage to Codecov
if: always()
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
with:
env_vars: OS,PYTHON
fail_ci_if_error: false
verbose: true
token: ${{ secrets.CODECOV_TOKEN }}
files: coverage/reports/merged.xml
flags: unittests-coverage
- name: Upload Doxygen coverage to Codecov
if: always()
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
with:
env_vars: OS,PYTHON
fail_ci_if_error: false
verbose: true
token: ${{ secrets.CODECOV_TOKEN }}
files: coverage/reports/doxygen-coverage-results/new.info
disable_search: true
flags: doxygen-coverage

View File

@@ -1,58 +0,0 @@
name: "CodeQL"
on:
push:
branches:
- main
- v*-branch
- collab-*
schedule:
- cron: '34 16 * * 6'
pull_request:
branches:
- main
- v*-branch
- collab-*
permissions:
contents: read
jobs:
analyze:
name: Analyze (${{ matrix.language }})
runs-on: ubuntu-24.04
permissions:
security-events: write
strategy:
fail-fast: false
matrix:
include:
- language: python
build-mode: none
- language: actions
build-mode: none
config: ./.github/codeql/codeql-actions-config.yml
- language: javascript-typescript
build-mode: none
config: ./.github/codeql/codeql-js-config.yml
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Initialize CodeQL
uses: github/codeql-action/init@0499de31b99561a6d14a36a5f662c2a54f91beee # v4.31.2
with:
languages: ${{ matrix.language }}
build-mode: ${{ matrix.build-mode }}
queries: security-extended
config-file: ${{ matrix.config }}
- if: matrix.build-mode == 'manual'
shell: bash
run: |
echo "nothing yet"
exit 0
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@0499de31b99561a6d14a36a5f662c2a54f91beee # v4.31.2
with:
category: "/language:${{matrix.language}}"

View File

@@ -1,64 +0,0 @@
name: Coding Guidelines
on: pull_request
permissions:
contents: read
jobs:
compliance_job:
runs-on: ubuntu-24.04
name: Run coding guidelines checks on patch series (PR)
steps:
- name: Checkout the code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Install Packages
run: |
sudo apt-get update
sudo apt-get install coccinelle
- name: Run Coding Guidelines Checks
continue-on-error: true
id: coding_guidelines
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=$PWD
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
git remote -v
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
source zephyr-env.sh
# debug
ls -la
git log --pretty=oneline | head -n 10
./scripts/ci/guideline_check.py --output output.txt -c origin/${BASE_REF}..
- name: check-warns
run: |
if [[ -s "output.txt" ]]; then
errors=$(cat output.txt)
errors="${errors//'%'/'%25'}"
errors="${errors//$'\n'/'%0A'}"
errors="${errors//$'\r'/'%0D'}"
echo "::error file=output.txt::$errors"
exit 1;
fi

View File

@@ -1,138 +0,0 @@
name: Compliance Checks
on:
pull_request:
types:
- edited
- opened
- reopened
- synchronize
permissions:
contents: read
jobs:
check_compliance:
runs-on: ubuntu-24.04
name: Run compliance checks on patch series (PR)
steps:
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Checkout the code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Rebase onto the target branch
env:
BASE_REF: ${{ github.base_ref }}
run: |
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
git remote -v
# Ensure there's no merge commits in the PR
[[ "$(git rev-list --merges --count origin/${BASE_REF}..)" == "0" ]] || \
(echo "::error ::Merge commits not allowed, rebase instead";false)
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
# debug
git log --pretty=oneline | head -n 10
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: west setup
run: |
west init -l . || true
west config manifest.group-filter -- +ci,-optional
west update -o=--depth=1 -n 2>&1 1> west.update.log || west update -o=--depth=1 -n 2>&1 1> west.update2.log
- name: Setup Node.js
uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 # v6.0.0
with:
node-version: "lts/*"
cache: npm
check-latest: true
cache-dependency-path: ./scripts/ci/package-lock.json
- name: Install Node dependencies
run: npm --prefix ./scripts/ci ci
- name: Run Compliance Tests
continue-on-error: true
id: compliance
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=$PWD
# debug
ls -la
git log --pretty=oneline | head -n 10
# Increase rename limit to allow for large PRs
git config diff.renameLimit 10000
excludes="-e KconfigBasic -e SysbuildKconfigBasic -e ClangFormat"
# The signed-off-by check for dependabot should be skipped
if [ "${{ github.actor }}" == "dependabot[bot]" ]; then
excludes="$excludes -e Identity"
fi
./scripts/ci/check_compliance.py --annotate $excludes -c origin/${BASE_REF}..
- name: upload-results
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
continue-on-error: true
with:
name: compliance.xml
path: compliance.xml
- name: Upload dts linter patch
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
continue-on-error: true
if: hashFiles('dts_linter.patch') != ''
with:
name: dts_linter.patch
path: dts_linter.patch
- name: check-warns
run: |
if [[ ! -s "compliance.xml" ]]; then
exit 1;
fi
warns=("ClangFormat" "LicenseAndCopyrightCheck")
files=($(./scripts/ci/check_compliance.py -l))
for file in "${files[@]}"; do
f="${file}.txt"
if [[ -s $f ]]; then
results=$(cat $f)
results="${results//'%'/'%25'}"
results="${results//$'\n'/'%0A'}"
results="${results//$'\r'/'%0D'}"
if [[ "${warns[@]}" =~ "${file}" ]]; then
echo "::warning file=${f}::$results"
else
echo "::error file=${f}::$results"
exit=1
fi
fi
done
if [ "${exit}" == "1" ]; then
echo "Compliance error, check for error messages in the \"Run Compliance Tests\" step"
echo "You can run this step locally with the ./scripts/ci/check_compliance.py script."
exit 1;
fi

View File

@@ -1,48 +0,0 @@
# Copyright (c) 2020 Intel Corp.
# SPDX-License-Identifier: Apache-2.0
name: Publish commit for daily testing
on:
schedule:
- cron: '50 22 * * *'
push:
branches:
- refs/tags/*
permissions:
contents: read
jobs:
get_version:
runs-on: ubuntu-24.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@00943011d9042930efac3dcd3a170e4273319bc8 # v5.1.0
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Upload to AWS S3
run: |
python3 scripts/ci/version_mgr.py --update .
aws s3 cp versions.json s3://testing.zephyrproject.org/daily_tests/versions.json

View File

@@ -1,52 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2020 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Devicetree script tests
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/dts/**'
- '.github/workflows/devicetree_checks.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/dts/**'
- '.github/workflows/devicetree_checks.yml'
permissions:
contents: read
jobs:
devicetree-checks:
name: Devicetree script tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-22.04, macos-14, windows-2022]
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: run tox
working-directory: scripts/dts/python-devicetree
run: |
tox

View File

@@ -1,279 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# SPDX-License-Identifier: Apache-2.0
name: Documentation Build
on:
push:
branches:
- main
tags:
- v*
pull_request:
permissions:
contents: read
env:
DOXYGEN_VERSION: 1.15.0
DOXYGEN_SHA256SUM: 0ec2e5b2c3cd82b7106d19cb42d8466450730b8cb7a9e85af712be38bf4523a1
JOB_COUNT: 8
jobs:
doc-file-check:
name: Check for doc changes
runs-on: ubuntu-24.04
outputs:
file_check: ${{ steps.check-doc-files.outputs.any_modified }}
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Check if Documentation related files changed
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
id: check-doc-files
with:
files: |
doc/
boards/**/doc/
**.rst
include/
kernel/include/kernel_arch_interface.h
lib/libc/**
subsys/testsuite/ztest/include/**
**/Kconfig*
west.yml
scripts/dts/
doc/requirements.txt
.github/workflows/doc-build.yml
scripts/pylib/pytest-twister-harness/src/twister_harness/device/device_adapter.py
scripts/pylib/pytest-twister-harness/src/twister_harness/helpers/shell.py
doc-build-html:
name: "Documentation Build (HTML)"
needs: [doc-file-check]
if: >
needs.doc-file-check.outputs.file_check == 'true' || github.event_name != 'pull_request'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
timeout-minutes: 60
concurrency:
group: doc-build-html-${{ github.ref }}
cancel-in-progress: true
env:
BASE_REF: ${{ github.base_ref }}
steps:
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Environment Setup
run: |
if [ "${{github.event_name}}" = "pull_request" ]; then
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Builder"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
fi
echo "$HOME/.local/bin" >> $GITHUB_PATH
echo "$HOME/.cargo/bin" >> $GITHUB_PATH
west init -l . || true
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Install Python packages required for documentation build
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
pip install -r doc/requirements.txt --require-hashes
- name: Build HTML documentation
shell: bash
run: |
if [[ "$GITHUB_REF" =~ "refs/tags/v" ]]; then
DOC_TAG="release"
else
DOC_TAG="development"
fi
if [[ "${{ github.event_name }}" == "pull_request" ]]; then
DOC_TARGET="html-fast"
else
DOC_TARGET="html"
fi
DOC_TAG=${DOC_TAG} \
SPHINXOPTS="-j ${JOB_COUNT} -W --keep-going -T" \
SPHINXOPTS_EXTRA="-q -t publish" \
make -C doc ${DOC_TARGET}
# API documentation coverage
python3 -m coverxygen --xml-dir doc/_build/html/doxygen/xml/ --src-dir include/ --output doc-coverage.info
# deprecated page causing issues
lcov --remove doc-coverage.info \*/deprecated > new.info
genhtml --no-function-coverage --no-branch-coverage new.info -o coverage-report
- name: Compress documentation build artifacts
run: |
tar --use-compress-program="xz -T0" -cf html-output.tar.xz --exclude html/_sources --exclude html/doxygen/xml --directory=doc/_build html
tar --use-compress-program="xz -T0" -cf api-output.tar.xz --directory=doc/_build html/doxygen/html
tar --use-compress-program="xz -T0" -cf api-coverage.tar.xz coverage-report
- name: Upload HTML output
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: html-output
path: html-output.tar.xz
- name: Upload Doxygen coverage artifacts
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: api-coverage
path: api-coverage.tar.xz
- name: Summarize PR documentation URLs
if: github.event_name == 'pull_request'
run: |
REPO_NAME="${{ github.event.repository.name }}"
PR_NUM="${{ github.event.pull_request.number }}"
DOC_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/docs/"
API_DOC_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/docs/doxygen/html/"
API_COVERAGE_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/api-coverage/"
echo "${PR_NUM}" > pr_num
echo "Documentation will be available shortly at: ${DOC_URL}" >> $GITHUB_STEP_SUMMARY
echo "API Documentation will be available shortly at: ${API_DOC_URL}" >> $GITHUB_STEP_SUMMARY
echo "API Coverage Report will be available shortly at: ${API_COVERAGE_URL}" >> $GITHUB_STEP_SUMMARY
- name: Upload PR number
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
if: github.event_name == 'pull_request'
with:
name: pr_num
path: pr_num
doc-build-pdf:
name: "Documentation Build (PDF)"
needs: [doc-file-check]
if: |
github.event_name != 'pull_request'
runs-on: ubuntu-24.04
timeout-minutes: 120
concurrency:
group: doc-build-pdf-${{ github.ref }}
cancel-in-progress: true
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: zephyr
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: doc/requirements.txt
- name: install-pkgs
run: |
sudo apt-get update
sudo apt-get install --no-install-recommends graphviz librsvg2-bin \
texlive-latex-base texlive-latex-extra latexmk \
texlive-fonts-recommended texlive-fonts-extra texlive-xetex \
imagemagick fonts-noto xindy
wget --no-verbose "https://github.com/doxygen/doxygen/releases/download/Release_${DOXYGEN_VERSION//./_}/doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz"
echo "${DOXYGEN_SHA256SUM} doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz" | sha256sum -c
if [ $? -ne 0 ]; then
echo "Failed to verify doxygen tarball"
exit 1
fi
sudo tar xf doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz -C /opt
echo "/opt/doxygen-${DOXYGEN_VERSION}/bin" >> $GITHUB_PATH
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@c125c5ebeeadbd727fa740b407f862734af1e52a # v1.0.9
with:
app-path: zephyr
toolchains: 'arm-zephyr-eabi'
enable-ccache: false
- name: install-pip-pkgs
working-directory: zephyr
run: |
pip install -r doc/requirements.txt --require-hashes
- name: build-docs
shell: bash
working-directory: zephyr
continue-on-error: true
run: |
if [[ "$GITHUB_REF" =~ "refs/tags/v" ]]; then
DOC_TAG="release"
else
DOC_TAG="development"
fi
DOC_TAG=${DOC_TAG} \
SPHINXOPTS="-q -j ${JOB_COUNT}" \
LATEXMKOPTS="-quiet -halt-on-error" \
make -C doc pdf
- name: upload-build
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: pdf-output
if-no-files-found: ignore
path: |
zephyr/doc/_build/latex/zephyr.pdf
zephyr/doc/_build/latex/zephyr.log
doc-build-status-check:
if: always()
name: "Documentation Build Status"
needs:
- doc-build-pdf
- doc-file-check
- doc-build-html
uses: ./.github/workflows/ready-to-merge.yml
with:
needs_context: ${{ toJson(needs) }}

View File

@@ -1,87 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2021 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Documentation Publish (Pull Request)
on:
workflow_run:
workflows: ["Documentation Build"]
types:
- completed
permissions:
contents: read
jobs:
doc-publish:
name: Publish Documentation
runs-on: ubuntu-24.04
if: |
github.event.workflow_run.event == 'pull_request' &&
github.event.workflow_run.conclusion == 'success' &&
github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download artifacts
id: download-artifacts
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
with:
workflow: doc-build.yml
run_id: ${{ github.event.workflow_run.id }}
if_no_artifact_found: ignore
- name: Load PR number
if: steps.download-artifacts.outputs.found_artifact == 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
with:
script: |
let fs = require("fs");
let pr_number = Number(fs.readFileSync("./pr_num/pr_num"));
core.exportVariable("PR_NUM", pr_number);
- name: Check PR number
if: steps.download-artifacts.outputs.found_artifact == 'true'
id: check-pr
uses: carpentries/actions/check-valid-pr@2e20fd5ee53b691e27455ce7ca3b16ea885140e8 # v0.15.0
with:
pr: ${{ env.PR_NUM }}
sha: ${{ github.event.workflow_run.head_sha }}
- name: Validate PR number
if: |
steps.download-artifacts.outputs.found_artifact == 'true' &&
steps.check-pr.outputs.VALID != 'true'
run: |
echo "ABORT: PR number validation failed!"
exit 1
- name: Uncompress HTML docs
if: steps.download-artifacts.outputs.found_artifact == 'true'
run: |
tar xf html-output/html-output.tar.xz -C html-output
if [ -f api-coverage/api-coverage.tar.xz ]; then
tar xf api-coverage/api-coverage.tar.xz -C api-coverage
fi
- name: Configure AWS Credentials
if: steps.download-artifacts.outputs.found_artifact == 'true'
uses: aws-actions/configure-aws-credentials@00943011d9042930efac3dcd3a170e4273319bc8 # v5.1.0
with:
aws-access-key-id: ${{ vars.AWS_BUILDS_ZEPHYR_PR_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_BUILDS_ZEPHYR_PR_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
if: steps.download-artifacts.outputs.found_artifact == 'true'
env:
HEAD_BRANCH: ${{ github.event.workflow_run.head_branch }}
run: |
aws s3 sync --quiet html-output/html \
s3://builds.zephyrproject.org/${{ github.event.repository.name }}/pr/${PR_NUM}/docs \
--delete
if [ -d api-coverage/coverage-report ]; then
aws s3 sync --quiet api-coverage/coverage-report/ \
s3://builds.zephyrproject.org/${{ github.event.repository.name }}/pr/${PR_NUM}/api-coverage \
--delete
fi

View File

@@ -1,64 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2021 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Documentation Publish
on:
workflow_run:
workflows: ["Documentation Build"]
branches:
- main
- v*
types:
- completed
permissions:
contents: read
jobs:
doc-publish:
name: Publish Documentation
runs-on: ubuntu-24.04
if: |
github.event.workflow_run.event != 'pull_request' &&
github.event.workflow_run.conclusion == 'success' &&
github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download artifacts
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
with:
workflow: doc-build.yml
run_id: ${{ github.event.workflow_run.id }}
- name: Uncompress HTML docs
run: |
tar xf html-output/html-output.tar.xz -C html-output
if [ -f api-coverage/api-coverage.tar.xz ]; then
tar xf api-coverage/api-coverage.tar.xz -C api-coverage
fi
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@00943011d9042930efac3dcd3a170e4273319bc8 # v5.1.0
with:
aws-access-key-id: ${{ vars.AWS_DOCS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_DOCS_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
env:
HEAD_BRANCH: ${{ github.event.workflow_run.head_branch }}
run: |
if [ "${HEAD_BRANCH:0:1}" == "v" ]; then
VERSION=${HEAD_BRANCH:1}
else
VERSION="latest"
fi
aws s3 sync --quiet html-output/html s3://docs.zephyrproject.org/${VERSION} --delete
aws s3 sync --quiet html-output/html/doxygen/html s3://docs.zephyrproject.org/apidoc/${VERSION} --delete
if [ -d api-coverage/coverage-report ]; then
aws s3 sync --quiet api-coverage/coverage-report/ s3://docs.zephyrproject.org/api-coverage/${VERSION} --delete
fi
aws s3 cp --quiet pdf-output/zephyr.pdf s3://docs.zephyrproject.org/${VERSION}/zephyr.pdf

View File

@@ -1,36 +0,0 @@
name: Error numbers
on:
pull_request:
paths:
- '.github/workflows/errno.yml'
- 'lib/libc/minimal/include/errno.h'
- 'scripts/ci/errno.py'
- 'SDK_VERSION'
permissions:
contents: read
jobs:
check-errno:
runs-on: ubuntu-24.04
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: zephyr
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@c125c5ebeeadbd727fa740b407f862734af1e52a # v1.0.9
with:
app-path: zephyr
toolchains: 'arm-zephyr-eabi'
west-group-filter: -hal,-tools,-bootloader,-babblesim
west-project-filter: -nrf_hw_models
enable-ccache: false
- name: Run errno.py
working-directory: zephyr
run: |
export ZEPHYR_SDK_INSTALL_DIR=${{ github.workspace }}/zephyr-sdk
export ZEPHYR_BASE=${PWD}
./scripts/ci/errno.py


@@ -1,138 +0,0 @@
name: Footprint Tracking
on:
schedule:
- cron: '50 00 * * *'
push:
paths:
- 'VERSION'
- '.github/workflows/footprint-tracking.yml'
branches:
- main
- v*-branch
tags:
# only publish v* tags, do not care about zephyr-v* which point to the
# same commit
- 'v*'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
footprint-tracking:
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
if: github.repository_owner == 'zephyrproject-rtos'
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
defaults:
run:
shell: bash
strategy:
fail-fast: false
env:
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Install packages
run: |
sudo apt-get update
sudo apt-get install -y python3-venv
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Environment Setup
run: |
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: west setup
run: |
west init -l . || true
west config --global update.narrow true
west update 2>&1 1> west.update.log || west update 2>&1 1> west.update2.log
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@00943011d9042930efac3dcd3a170e4273319bc8 # v5.1.0
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Record Footprint
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=${PWD}
./scripts/footprint/track.py -p scripts/footprint/plan.txt
- name: Upload footprint data
run: |
python3 -m venv .venv
. .venv/bin/activate
aws s3 sync --quiet footprint_data/ s3://testing.zephyrproject.org/footprint_data/
- name: Transform Footprint data to Twister JSON reports
run: |
shopt -s globstar
export ZEPHYR_BASE=${PWD}
python3 ./scripts/footprint/pack_as_twister.py -vvv \
--plan ./scripts/footprint/plan.txt \
--test-name='name.feature' \
./footprint_data/**/
- name: Upload to ElasticSearch
env:
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
ELASTICSEARCH_INDEX: ${{ vars.FOOTPRINT_TRACKING_INDEX }}
run: |
shopt -s globstar
run_date=`date --iso-8601=minutes`
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--flatten footprint \
--flatten-list-names "{'children':'name'}" \
--transform "{ 'footprint_name': '^(?P<footprint_area>([^\/]+\/){0,2})(?P<footprint_path>([^\/]*\/)*)(?P<footprint_symbol>[^\/]*)$' }" \
--run-id "${{ github.run_id }}" \
--run-attempt "${{ github.run_attempt }}" \
--run-workflow "footprint-tracking:${{ github.event_name }}" \
--run-branch "${{ github.ref_name }}" \
-i ${ELASTICSEARCH_INDEX} \
./footprint_data/**/twister_footprint.json
#
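
The --transform argument passed to upload_test_results_es.py in the step above keys a named-group regular expression on footprint_name, presumably so the script can split each recorded name into area, path, and symbol fields before indexing. The Python sketch below only demonstrates what that regex itself captures; the input string is an invented example, not real footprint data.

import re

# Same pattern as in the --transform argument above (the \/ escapes are kept verbatim).
PATTERN = re.compile(
    r'^(?P<footprint_area>([^\/]+\/){0,2})'
    r'(?P<footprint_path>([^\/]*\/)*)'
    r'(?P<footprint_symbol>[^\/]*)$'
)

# Hypothetical footprint name, used only to illustrate the capture groups.
match = PATTERN.match("kernel/sched/timeslicing/z_time_slice")
if match:
    # footprint_area   -> 'kernel/sched/'   (at most the first two segments)
    # footprint_path   -> 'timeslicing/'    (any remaining intermediate segments)
    # footprint_symbol -> 'z_time_slice'    (the final component)
    print(match.groupdict())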


@@ -1,64 +0,0 @@
name: Greet first time contributor
on:
issues:
types: [opened]
pull_request_target:
types: [opened, closed]
permissions:
contents: read
jobs:
check_for_first_interaction:
runs-on: ubuntu-24.04
if: github.repository == 'zephyrproject-rtos/zephyr'
permissions:
pull-requests: write # to comment on pull requests
issues: write # to comment on issues
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- uses: zephyrproject-rtos/action-first-interaction@58853996b1ac504b8e0f6964301f369d2bb22e5c # v1.1.1+zephyr.6
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
issue-message: >
Hi @${{github.event.issue.user.login}}! We appreciate you submitting your first issue
for our open-source project. 🌟
Even though I'm a bot, I can assure you that the whole community is genuinely grateful
for your time and effort. 🤖💙
pr-opened-message: >
Hello @${{ github.event.pull_request.user.login }}, and thank you very much for your
first pull request to the Zephyr project!
Our Continuous Integration pipeline will execute a series of checks on your Pull Request
commit messages and code, and you are expected to address any failures by updating the PR.
Please take a look at [our commit message guidelines](https://docs.zephyrproject.org/latest/contribute/guidelines.html#commit-message-guidelines)
to find out how to format your commit messages, and at [our contribution workflow](https://docs.zephyrproject.org/latest/contribute/guidelines.html#contribution-workflow)
to understand how to update your Pull Request.
If you haven't already, please make sure to review the project's [Contributor
Expectations](https://docs.zephyrproject.org/latest/contribute/contributor_expectations.html)
and update (by amending and force-pushing the commits) your pull request if necessary.
If you are stuck or need help please join us on [Discord](https://chat.zephyrproject.org/)
and ask your question there. Additionally, you can [escalate the review](https://docs.zephyrproject.org/latest/contribute/contributor_expectations.html#pr-technical-escalation)
when applicable. 😊
pr-merged-message: >
Hi @${{ github.event.pull_request.user.login }}!
Congratulations on getting your very first Zephyr pull request merged 🎉🥳. This is a
fantastic achievement, and we're thrilled to have you as part of our community!
To celebrate this milestone and showcase your contribution, we'd love to award you the
Zephyr Technical Contributor badge. If you're interested, please claim your badge by
filling out this form: [Claim Your Zephyr Badge](https://forms.gle/oCw9iAPLhUsHTapc8).
Thank you for your valuable input, and we look forward to seeing more of your
contributions in the future! 🪁


@@ -1,88 +0,0 @@
name: Hello World (Multiplatform)
on:
push:
branches:
- main
- v*-branch
- collab-*
pull_request:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/**'
- '.github/workflows/hello_world_multiplatform.yaml'
- 'SDK_VERSION'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
build:
strategy:
fail-fast: false
matrix:
os: [ubuntu-22.04, ubuntu-24.04, ubuntu-24.04-arm, macos-14, windows-2022]
runs-on: ${{ matrix.os }}
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: zephyr
fetch-depth: 0
- name: Rebase
if: github.event_name == 'pull_request'
env:
BASE_REF: ${{ github.base_ref }}
PR_HEAD: ${{ github.event.pull_request.head.sha }}
working-directory: zephyr
shell: bash
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --graph --oneline HEAD...${PR_HEAD}
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@c125c5ebeeadbd727fa740b407f862734af1e52a # v1.0.9
with:
app-path: zephyr
toolchains: aarch64-zephyr-elf:arc-zephyr-elf:arc64-zephyr-elf:arm-zephyr-eabi:mips-zephyr-elf:riscv64-zephyr-elf:sparc-zephyr-elf:x86_64-zephyr-elf:xtensa-dc233c_zephyr-elf:xtensa-sample_controller32_zephyr-elf:rx-zephyr-elf
ccache-cache-key: hw-${{ matrix.os }}
- name: Build firmware
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O/tmp/twister-out"
elif [ "${{ runner.os }}-${{ runner.arch }}" == "Linux-ARM64" ]; then
EXTRA_TWISTER_FLAGS="--exclude-platform native_sim/native"
fi
west twister --runtime-artifact-cleanup --force-color --inline-logs -T samples/hello_world -T samples/cpp/hello_world -v $EXTRA_TWISTER_FLAGS
- name: Upload artifacts
if: failure()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
if-no-files-found: ignore
path:
zephyr/twister-out/*/samples/hello_world/sample.basic.helloworld/build.log
zephyr/twister-out/*/samples/cpp/hello_world/sample.cpp.helloworld/build.log


@@ -1,57 +0,0 @@
name: Issue Tracker
on:
schedule:
- cron: '10 1/4 * * *'
permissions:
contents: read
env:
OUTPUT_FILE_NAME: IssuesReport.md
COMMITTER_EMAIL: actions@github.com
COMMITTER_NAME: github-actions
COMMITTER_USERNAME: github-actions
jobs:
track-issues:
name: "Collect Issue Stats"
runs-on: ubuntu-24.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download configuration file
run: |
wget -q https://raw.githubusercontent.com/$GITHUB_REPOSITORY/main/.github/workflows/issues-report-config.json
- name: install-packages
run: |
sudo apt-get update
sudo apt-get install discount
- uses: brcrista/summarize-issues@54c549b7d38b7db39e5c6e06fd9617e12e5c3491 # v4
with:
title: 'Issues Report for ${{ github.repository }}'
configPath: 'issues-report-config.json'
outputPath: ${{ env.OUTPUT_FILE_NAME }}
token: ${{ secrets.GITHUB_TOKEN }}
- name: upload-stats
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
continue-on-error: true
with:
name: ${{ env.OUTPUT_FILE_NAME }}
path: ${{ env.OUTPUT_FILE_NAME }}
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@00943011d9042930efac3dcd3a170e4273319bc8 # v5.1.0
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Post Results
run: |
mkd2html IssuesReport.md IssuesReport.html
aws s3 cp --quiet IssuesReport.html s3://testing.zephyrproject.org/issues/$GITHUB_REPOSITORY/index.html


@@ -1,37 +0,0 @@
[
{
"section": "High Priority Bugs",
"labels": ["bug", "priority: high"],
"threshold": 0
},
{
"section": "Medium Priority Bugs",
"labels": ["bug", "priority: medium"],
"threshold": 20
},
{
"section": "Low Priority Bugs",
"labels": ["bug", "priority: low"],
"threshold": 100
},
{
"section": "Enhancements",
"labels": ["Enhancement"],
"threshold": 500
},
{
"section": "Features",
"labels": ["Feature"],
"threshold": 100
},
{
"section": "Questions",
"labels": ["question"],
"threshold": 100
},
{
"section": "Static Analysis",
"labels": ["Coverity"],
"threshold": 100
}
]


@@ -1,37 +0,0 @@
name: Scancode
on: [pull_request]
permissions:
contents: read
jobs:
scancode_job:
runs-on: ubuntu-24.04
name: Scan code for licenses
steps:
- name: Checkout the code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Scan the code
id: scancode
uses: zephyrproject-rtos/action_scancode@23ef91ce31cd4b954366a7b71eea47520da9b380 # v4
with:
directory-to-scan: 'scan/'
- name: Artifact Upload
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: scancode
path: ./artifacts
- name: Verify
run: |
if [ -s ./artifacts/report.txt ]; then
report=$(cat ./artifacts/report.txt)
report="${report//'%'/'%25'}"
report="${report//$'\n'/'%0A'}"
report="${report//$'\r'/'%0D'}"
echo "::error file=./artifacts/report.txt::$report"
exit 1
fi


@@ -1,45 +0,0 @@
name: Manifest
on:
pull_request_target:
permissions:
contents: read
jobs:
contribs:
runs-on: ubuntu-24.04
permissions:
pull-requests: write # to create/update pull request comments
name: Manifest
steps:
- name: Checkout the code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: zephyrproject/zephyr
fetch-depth: 0
persist-credentials: false
- name: west setup
env:
BASE_REF: ${{ github.base_ref }}
working-directory: zephyrproject/zephyr
run: |
pip install -r scripts/requirements-west.txt --require-hashes
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
west init -l . || true
- name: Manifest
uses: zephyrproject-rtos/action-manifest@09983f53d3d878791aa37a7755ae44d695f4c1e5 # v2.0.0
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
manifest-path: 'west.yml'
checkout-path: 'zephyrproject/zephyr'
use-tree-checkout: 'true'
check-impostor-commits: 'true'
label-prefix: 'manifest-'
verbosity-level: '1'
labels: 'manifest'
dnm-labels: 'DNM (manifest)'
blobs-added-labels: 'Binary Blobs Added'
blobs-modified-labels: 'Binary Blobs Modified'


@@ -1,19 +0,0 @@
name: Check SHA-pinned GitHub Actions
on:
pull_request:
paths:
- '.github/workflows/**'
permissions:
contents: read
jobs:
check-sha-pinned-actions:
name: Verify GitHub Actions
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Ensure SHA pinned actions
uses: zgosalvez/github-actions-ensure-sha-pinned-actions@9e9574ef04ea69da568d6249bd69539ccc704e74 # v4.0.0


@@ -1,47 +0,0 @@
name: PR Metadata Check
on:
pull_request:
types:
- synchronize
- opened
- reopened
- labeled
- unlabeled
- edited
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
do-not-merge:
name: Prevent Merging
runs-on: ubuntu-24.04
timeout-minutes: 30
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python dependencies
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run the check script
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
python -u scripts/ci/do_not_merge.py \
-p "${{ github.event.pull_request.number }}" \
-o "${{ github.event.repository.owner.login }}" \
-r "${{ github.event.repository.name }}"


@@ -1,53 +0,0 @@
# Copyright (c) 2023 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Misc. Pylib Scripts TestSuite
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/build_helpers/**'
- '.github/workflows/pylib_tests.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/build_helpers/**'
- '.github/workflows/pylib_tests.yml'
permissions:
contents: read
jobs:
pylib-tests:
name: Misc. Pylib Unit Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04]
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run pytest for build_helpers
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
echo "Run build_helpers tests"
PYTHONPATH=./scripts/tests pytest ./scripts/tests/build_helpers


@@ -1,31 +0,0 @@
name: ready to merge
on:
workflow_call:
inputs:
needs_context:
type: string
required: true
permissions:
contents: read
jobs:
all_jobs_passed:
name: all jobs passed
runs-on: ubuntu-latest
steps:
- name: "Check status of all required jobs"
run: |-
NEEDS_CONTEXT='${{ inputs.needs_context }}'
JOB_IDS=$(echo "$NEEDS_CONTEXT" | jq -r 'keys[]')
for JOB_ID in $JOB_IDS; do
RESULT=$(echo "$NEEDS_CONTEXT" | jq -r ".[\"$JOB_ID\"].result")
echo "$JOB_ID job result: $RESULT"
if [[ $RESULT != "success" && $RESULT != "skipped" ]]; then
echo "***"
echo "Error: The $JOB_ID job did not pass."
exit 1
fi
done
echo "All jobs passed or were skipped."


@@ -1,65 +0,0 @@
name: Create a Release
on:
push:
tags:
- 'v*'
- '!v*rc*'
permissions:
contents: read
jobs:
release:
runs-on: ubuntu-24.04
permissions:
contents: write # to create GitHub release entry
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Get the version
id: get_version
run: |
echo "VERSION=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
echo "TRIMMED_VERSION=${GITHUB_REF#refs/tags/v}" >> $GITHUB_OUTPUT
- name: REUSE Compliance Check
uses: fsfe/reuse-action@676e2d560c9a403aa252096d99fcab3e1132b0f5 # v6.0.0
with:
args: spdx -o zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
- name: upload-results
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
continue-on-error: true
with:
name: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
path: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
- name: Create empty release notes body
run: |
echo "TODO: add release overview and notes link" > release-notes.txt
- name: Create Release
id: create_release
uses: actions/create-release@0cb9c9b65d5d1901c1f53e5e66eaf4afd303e70e # v1.1.4
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
tag_name: ${{ github.ref }}
release_name: Zephyr ${{ steps.get_version.outputs.TRIMMED_VERSION }}
body_path: release-notes.txt
draft: true
prerelease: true
- name: Upload Release Assets
id: upload-release-asset
uses: actions/upload-release-asset@e8f9f06c4b078e705bd2ea027f0926603fc9b4d5 # v1.0.2
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
asset_name: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
asset_content_type: text/plain


@@ -1,61 +0,0 @@
# This workflow uses actions that are not certified by GitHub. They are provided
# by a third-party and are governed by separate terms of service, privacy
# policy, and support documentation.
name: Scorecards supply-chain security
on:
# For Branch-Protection check. Only the default branch is supported. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection
branch_protection_rule:
# To guarantee Maintained check is occasionally updated. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#maintained
schedule:
- cron: '43 7 * * 6'
push:
branches:
- main
permissions: read-all
jobs:
analysis:
name: Scorecard analysis
runs-on: ubuntu-latest
permissions:
# Needed for Code scanning upload
security-events: write
# Needed for GitHub OIDC token if publish_results is true
id-token: write
steps:
- name: "Checkout code"
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: "Run analysis"
uses: ossf/scorecard-action@4eaacf0543bb3f2c246792bd56e8cdeffafb205a # v2.4.3
with:
results_file: results.sarif
results_format: sarif
# Publish results to OpenSSF REST API for easy access by consumers.
# - Allows the repository to include the Scorecard badge.
# - See https://github.com/ossf/scorecard-action#publishing-results.
publish_results: true
# Upload the results as artifacts (optional). Commenting out will disable
# uploads of run results in SARIF format to the repository Actions tab.
# https://docs.github.com/en/actions/advanced-guides/storing-workflow-data-as-artifacts
- name: "Upload artifact"
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: SARIF file
path: results.sarif
retention-days: 5
# Upload the results to GitHub's code scanning dashboard (optional).
# Commenting out will disable upload of results to your repo's Code Scanning dashboard
- name: "Upload to code-scanning"
uses: github/codeql-action/upload-sarif@0499de31b99561a6d14a36a5f662c2a54f91beee # v4.31.2
with:
sarif_file: results.sarif


@@ -1,70 +0,0 @@
# Copyright 2023 Google LLC
# SPDX-License-Identifier: Apache-2.0
name: Scripts tests
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/build/**'
- '.github/workflows/scripts_tests.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/build/**'
- '.github/workflows/scripts_tests.yml'
permissions:
contents: read
jobs:
scripts-tests:
name: Scripts tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04]
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Rebase
continue-on-error: true
env:
BASE_REF: ${{ github.base_ref }}
PR_HEAD: ${{ github.event.pull_request.head.sha }}
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --graph --oneline HEAD...${PR_HEAD}
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run pytest
env:
ZEPHYR_BASE: ./
run: |
echo "Run script tests"
pytest ./scripts/build


@@ -1,33 +0,0 @@
name: Stale Workflow Queue Cleanup
on:
workflow_dispatch:
schedule:
# everyday at 15:00
- cron: '0 15 * * *'
permissions:
contents: read
concurrency:
group: stale-workflow-queue-cleanup
cancel-in-progress: true
jobs:
cleanup:
name: Cleanup
runs-on: ubuntu-24.04
if: github.ref == 'refs/heads/main'
permissions:
actions: write # to delete stale workflow runs
steps:
- name: Delete stale queued workflow runs
uses: MajorScruffy/delete-old-workflow-runs@78b5af714fefaefdf74862181c467b061782719e # v0.3.0
with:
repository: ${{ github.repository }}
# Remove any workflow runs in "queued" state for more than 1 day
older-than-seconds: 86400
status: queued
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -1,35 +0,0 @@
name: "Close stale pull requests/issues"
on:
schedule:
- cron: "16 00 * * *"
permissions:
contents: read
jobs:
stale:
name: Find Stale issues and PRs
runs-on: ubuntu-24.04
if: github.repository == 'zephyrproject-rtos/zephyr'
permissions:
pull-requests: write # to comment on stale pull requests
issues: write # to comment on stale issues
steps:
- uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
with:
stale-pr-message: 'This pull request has been marked as stale because it has been open (more
than) 60 days with no activity. Remove the stale label or add a comment saying that you
would like to have the label removed; otherwise this pull request will automatically be
closed in 14 days. Note that you can always re-open a closed pull request at any time.'
stale-issue-message: 'This issue has been marked as stale because it has been open (more
than) 60 days with no activity. Remove the stale label or add a comment saying that you
would like to have the label removed; otherwise this issue will automatically be closed in
14 days. Note that you can always re-open a closed issue at any time.'
days-before-stale: 60
days-before-close: 14
stale-issue-label: 'Stale'
stale-pr-label: 'Stale'
exempt-pr-labels: 'Blocked,In progress'
exempt-issue-labels: 'In progress,Enhancement,Feature,Feature Request,RFC,Meta,Process,Coverity'
operations-per-run: 400


@@ -1,38 +0,0 @@
name: Merged PR stats
on:
pull_request_target:
branches:
- main
- v*-branch
types: [closed]
permissions:
contents: read
jobs:
record_merged:
if: github.event.pull_request.merged == true && github.repository == 'zephyrproject-rtos/zephyr'
runs-on: ubuntu-24.04
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: PR event
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
PR_STAT_ES_INDEX: ${{ vars.PR_STAT_ES_INDEX }}
run: |
python3 ./scripts/ci/stats/merged_prs.py --pull-request ${{ github.event.pull_request.number }} --repo ${{ github.repository }}


@@ -1,64 +0,0 @@
name: Publish Twister Test Results
on:
workflow_run:
workflows: ["Run tests with twister"]
branches:
- main
types:
- completed
permissions:
contents: read
jobs:
upload-to-elasticsearch:
if: |
github.repository == 'zephyrproject-rtos/zephyr' &&
github.event.workflow_run.event != 'pull_request'
env:
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
runs-on: ubuntu-24.04
steps:
# Needed for elasticsearch and upload script
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Download Artifacts
id: download-artifacts
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
with:
path: artifacts
workflow: twister.yml
run_id: ${{ github.event.workflow_run.id }}
if_no_artifact_found: ignore
- name: Upload to elasticsearch
if: steps.download-artifacts.outputs.found_artifact == 'true'
run: |
# set run date on upload to get consistent and unified data across the matrix.
run_date=`date --iso-8601=minutes`
if [ "${{github.event.workflow_run.event}}" = "push" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--run-attempt ${{github.run_attempt}} \
--run-branch ${{github.ref_name}} \
--index zephyr-main-ci-push-1 artifacts/*/*/twister.json
elif [ "${{github.event.workflow_run.event}}" = "schedule" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--run-attempt ${{github.run_attempt}} \
--run-branch ${{github.ref_name}} \
--index zephyr-main-ci-weekly-1 artifacts/*/*/twister.json
fi


@@ -1,402 +0,0 @@
name: Run tests with twister
on:
push:
branches:
- main
- v*-branch
- collab-*
pull_request:
branches:
- main
- v*-branch
- collab-*
schedule:
# Run at 02:00 UTC on every Sunday
- cron: '0 2 * * 0'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
twister-build-prep:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on: ubuntu-24.04
outputs:
subset: ${{ steps.output-services.outputs.subset }}
size: ${{ steps.output-services.outputs.size }}
fullrun: ${{ steps.output-services.outputs.fullrun }}
env:
MATRIX_SIZE: 10
PUSH_MATRIX_SIZE: 20
WEEKLY_MATRIX_SIZE: 200
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
TESTS_PER_BUILDER: 900
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
steps:
- name: Checkout
if: github.event_name == 'pull_request'
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
path: zephyr
persist-credentials: false
- name: Set up Python
if: github.event_name == 'pull_request'
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: install-packages
working-directory: zephyr
if: github.event_name == 'pull_request'
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Setup Zephyr project
if: github.event_name == 'pull_request'
uses: zephyrproject-rtos/action-zephyr-setup@c125c5ebeeadbd727fa740b407f862734af1e52a # v1.0.9
with:
app-path: zephyr
enable-ccache: false
- name: Environment Setup
working-directory: zephyr
if: github.event_name == 'pull_request'
run: |
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
- name: Generate Test Plan with Twister
working-directory: zephyr
if: github.event_name == 'pull_request'
id: test-plan
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --pull-request -t $TESTS_PER_BUILDER
if [ -s .testplan ]; then
cat .testplan >> $GITHUB_ENV
else
echo "TWISTER_NODES=${MATRIX_SIZE}" >> $GITHUB_ENV
fi
rm -f testplan.json .testplan
- name: Determine matrix size
id: output-services
run: |
if [ "${{github.event_name}}" = "push" ]; then
subset="[$(seq -s',' 1 ${PUSH_MATRIX_SIZE})]"
size=${MATRIX_SIZE}
elif [ "${{github.event_name}}" = "pull_request" ]; then
if [ -n "${TWISTER_NODES}" ]; then
subset="[$(seq -s',' 1 ${TWISTER_NODES})]"
else
subset="[$(seq -s',' 1 ${MATRIX_SIZE})]"
fi
size=${TWISTER_NODES}
elif [ "${{github.event_name}}" = "schedule" -a "${{github.repository}}" = "zephyrproject-rtos/zephyr" ]; then
subset="[$(seq -s',' 1 ${WEEKLY_MATRIX_SIZE})]"
size=${WEEKLY_MATRIX_SIZE}
else
size=0
fi
echo "subset=${subset}" >> $GITHUB_OUTPUT
echo "size=${size}" >> $GITHUB_OUTPUT
echo "fullrun=${TWISTER_FULL}" >> $GITHUB_OUTPUT
twister-build:
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
needs: twister-build-prep
if: needs.twister-build-prep.outputs.size != 0
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
subset: ${{fromJSON(needs.twister-build-prep.outputs.subset)}}
timeout-minutes: 1440
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
# `--specs` is ignored because ccache is unable to resolve the toolchain specs file path.
CCACHE_IGNOREOPTIONS: '-specs=* --specs=*'
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
TWISTER_COMMON: ' --test-config tests/test_config_ci.yaml --force-color --inline-logs -v -N -M --retry-failed 3 --timeout-multiplier 2 '
WEEKLY_OPTIONS: ' -M --build-only --all --show-footprint --report-filtered -j 32'
PR_OPTIONS: ' --clobber-output --integration -j 16'
PUSH_OPTIONS: ' --clobber-output -M --show-footprint --report-filtered -j 16'
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
LLVM_TOOLCHAIN_PATH: /usr/lib/llvm-20
steps:
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Environment Setup
run: |
if [ "${{github.event_name}}" = "pull_request" ]; then
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Builder"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
fi
echo "$HOME/.local/bin" >> $GITHUB_PATH
echo "$HOME/.cargo/bin" >> $GITHUB_PATH
west init -l . || true
west config manifest.group-filter -- +ci,+optional,+testing
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Check Environment
run: |
cmake --version
gcc --version
cargo --version
rustup target list --installed
ls -la
echo "github.ref: ${{ github.ref }}"
echo "github.base_ref: ${{ github.base_ref }}"
echo "github.ref_name: ${{ github.ref_name }}"
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
west packages pip --install
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- name: Update BabbleSim to manifest revision
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git -c advice.detachedHead=false checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- if: github.event_name == 'push'
name: Run Tests with Twister (Push)
id: run_twister
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} ${TWISTER_COMMON} ${PUSH_OPTIONS}
if [ "${{matrix.subset}}" = "1" ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${PUSH_OPTIONS}
fi
fi
- if: github.event_name == 'pull_request'
name: Run Tests with Twister (Pull Request)
id: run_twister_pr
run: |
rm -f testplan.json
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --pull-request
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} --load-tests testplan.json ${TWISTER_COMMON} ${PR_OPTIONS}
if [ "${{matrix.subset}}" = "1" -a ${{needs.twister-build-prep.outputs.fullrun}} = 'True' ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${PR_OPTIONS}
fi
fi
- if: github.event_name == 'schedule'
name: Run Tests with Twister (Weekly)
id: run_twister_sched
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} ${TWISTER_COMMON} ${WEEKLY_OPTIONS}
if [ "${{matrix.subset}}" = "1" ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${WEEKLY_OPTIONS}
fi
fi
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Unit Test Results (Subset ${{ matrix.subset }})
if-no-files-found: ignore
path: |
twister-out/twister.xml
twister-out/twister.json
module_tests/twister.xml
testplan.json
- if: matrix.subset == 1 && github.event_name == 'push'
name: Save the list of Python packages
shell: bash
run: |
FREEZE_FILE="frozen-requirements.txt"
timestamp="$(date)"
version="$(git describe --abbrev=12 --always)"
echo -e "# Generated at $timestamp ($version)\n" > $FREEZE_FILE
pip freeze | tee -a $FREEZE_FILE
- if: matrix.subset == 1 && github.event_name == 'push'
name: Upload the list of Python packages
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Frozen PIP package set
path: |
frozen-requirements.txt
twister-test-results:
name: "Publish Unit Tests Results"
needs:
- twister-build
runs-on: ubuntu-24.04
permissions:
checks: write # to create the check run entry with Twister test results
# the build-and-test job might be skipped; we don't need to run this job then
if: success() || failure()
steps:
- name: Check out source code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Download Artifacts
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
path: artifacts
- name: Merge Test Results
run: |
junitparser merge --glob 'artifacts/*/twister.xml' 'artifacts/*/*/twister.xml' junit.xml
junit2html junit.xml junit.html
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Unit Test Results
if-no-files-found: ignore
path: |
junit.html
junit.xml
- name: Upload test results to Codecov
if: ${{ !cancelled() && (github.event_name == 'push') }}
uses: codecov/test-results-action@47f89e9acb64b76debcd5ea40642d25a4adced9f # v1.1.1
with:
token: ${{ secrets.CODECOV_TOKEN }}
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@34d7c956a59aed1bfebf31df77b8de55db9bbaaf # v2.21.0
with:
check_name: Unit Test Results
files: "**/twister.xml"
comment_mode: off
- name: Analyze Twister Reports
if: needs.twister-build.result == 'failure'
run: |
./scripts/ci/twister_report_analyzer.py artifacts/*/*/twister.json --long-summary --platforms --output-md errors.md
if [[ -s "errors.md" ]]; then
echo '### Error Summary! 🚀' >> $GITHUB_STEP_SUMMARY
cat errors.md >> $GITHUB_STEP_SUMMARY
fi
- name: Upload Twister Analysis Results
if: needs.twister-build.result == 'failure'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Twister Analysis Results
if-no-files-found: ignore
path: |
twister_report_summary.json
twister-status-check:
if: always()
name: "Check Twister Status"
needs:
- twister-build-prep
- twister-build
uses: ./.github/workflows/ready-to-merge.yml
with:
needs_context: ${{ toJson(needs) }}


@@ -1,69 +0,0 @@
# Copyright (c) 2020 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Twister TestSuite
on:
push:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/pylib/**'
- 'scripts/twister'
- 'scripts/tests/twister/**'
- '.github/workflows/twister_tests.yml'
- 'scripts/schemas/twister/'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/**'
- 'scripts/twister'
- 'scripts/tests/twister/**'
- '.github/workflows/twister_tests.yml'
- 'scripts/schemas/twister/'
permissions:
contents: read
jobs:
twister-tests:
name: Twister Unit Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04]
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run pytest for twisterlib
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
echo "Run twister tests"
PYTHONPATH=./scripts/tests pytest ./scripts/tests/twister
- name: Run pytest for pytest-twister-harness
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
PYTHONPATH: ./scripts/pylib/pytest-twister-harness/src:${PYTHONPATH}
run: |
echo "Run twister tests"
pytest ./scripts/pylib/pytest-twister-harness/tests


@@ -1,68 +0,0 @@
# Copyright (c) 2023 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Twister BlackBox TestSuite
on:
pull_request:
branches:
- main
- collab-*
- v*-branch
paths:
- 'scripts/pylib/twister/**'
- 'scripts/twister'
- 'scripts/tests/twister_blackbox/**'
- '.github/workflows/twister_tests_blackbox.yml'
permissions:
contents: read
env:
PYTHONIOENCODING: utf-8
jobs:
twister-tests:
name: Twister Black Box Tests
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04, macos-14, windows-2022]
fail-fast: false
runs-on: ${{ matrix.os }}
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: zephyr
fetch-depth: 0
- name: Set Up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@c125c5ebeeadbd727fa740b407f862734af1e52a # v1.0.9
with:
app-path: zephyr
toolchains: all
enable-ccache: false
west-group-filter: -tools,-bootloader,-babblesim,-hal
west-project-filter: -nrf_hw_models,+cmsis,+hal_xtensa,+cmsis_6
- name: Run Pytest For Twister Black Box Tests
if: ${{ runner.os == 'Linux' }}
working-directory: zephyr
shell: bash
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
export ZEPHYR_SDK_INSTALL_DIR=${{ github.workspace }}/zephyr-sdk
sudo apt-get install -y lcov
echo "Run twister tests"
source zephyr-env.sh
PYTHONPATH="./scripts/tests" pytest ./scripts/tests/twister_blackbox/


@@ -1,60 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# SPDX-License-Identifier: Apache-2.0
name: Zephyr West Command Tests
on:
push:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/west-commands.yml'
- 'scripts/west_commands/**'
- '.github/workflows/west_cmds.yml'
pull_request:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/west-commands.yml'
- 'scripts/west_commands/**'
- '.github/workflows/west_cmds.yml'
permissions:
contents: read
jobs:
west-commands:
name: West Command Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-22.04, macos-14, windows-2022]
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: run pytest-win
if: runner.os == 'Windows'
run: |
python ./scripts/west_commands/run_tests.py
- name: run pytest-mac-linux
if: runner.os != 'Windows'
run: |
./scripts/west_commands/run_tests.py

.gitignore

@@ -1,6 +1,3 @@
# SPDX-License-Identifier: Apache-2.0
# SPDX-FileCopyrightText: Copyright The Zephyr Project Contributors
*.o
*.a
*.d
@@ -10,116 +7,24 @@
*.swp
*.swo
*~
# Emacs
\#*\#
build*/
!doc/build/
!scripts/build
!tests/drivers/build_all
!scripts/pylib/build_helpers
cscope.*
.dir
/*.patch
# The .cache directory will be used to cache toolchain capabilities if
# no suitable out-of-tree directory is found.
.cache
outdir
outdir-*
scripts/basic/fixdep
scripts/gen_idt/gen_idt
coverage-report
doc-coverage.info
scripts/gen_offset_header/gen_offset_header
scripts/kconfig/conf
scripts/kconfig/mconf
scripts/kconfig/zconf.hash.c
scripts/kconfig/zconf.lex.c
scripts/kconfig/zconf.tab.c
doc/_build
doc/doxygen
doc/xml
doc/html
doc/boards
doc/samples
doc/latex
doc/themes/zephyr-docs-theme
sanity-out*
twister-out*
bsim_out
bsim_bt_out
myresults.xml
tests/RunResults.xml
sanity-out/
scripts/grub
doc/reference/kconfig/*.rst
doc/doc.warnings
.*project
.settings
.envrc
.vscode
hide-defaults-note
venv
.venv
.envrc
.DS_Store
.clangd
new.info
# Cargo drops lock files in projects to capture resolved dependencies.
# We don't want to record these.
Cargo.lock
# Cargo encourages a .cargo/config.toml file to symlink to a generated file. Don't save these.
.cargo/
# Normal west builds will place the Rust target directory under the build directory. However,
# sometimes IDEs and such will litter these target directories as well.
target/
# CI output
compliance.xml
dts_linter.patch
_error.types
# Tag files
GPATH
GRTAGS
GTAGS
TAGS
doc/reference/kconfig/CONFIG_*
doc/reference/kconfig/index.rst
tags
.idea
# from check_compliance.py
# zephyr-keep-sorted-start
BinaryFiles.txt
BoardYml.txt
Checkpatch.txt
ClangFormat.txt
DevicetreeBindings.txt
DevicetreeLinting.txt
GitDiffCheck.txt
Gitlint.txt
Identity.txt
ImageSize.txt
Kconfig.txt
KconfigBasic.txt
KconfigBasicNoModules.txt
KconfigHWMv2.txt
KeepSorted.txt
LicenseAndCopyrightCheck.txt
MaintainersFormat.txt
ModulesMaintainers.txt
Nits.txt
Pylint.txt
PythonCompat.txt
Ruff.txt
SphinxLint.txt
SysbuildKconfig.txt
SysbuildKconfigBasic.txt
SysbuildKconfigBasicNoModules.txt
TextEncoding.txt
YAMLLint.txt
ZephyrModuleFile.txt
# zephyr-keep-sorted-stop
# Node dependencies
node_modules


@@ -1,66 +0,0 @@
# All these sections are optional, edit this file as you like.
# Zephyr-specific defaults are located in scripts/gitlint/zephyr_commit_rules.py
[general]
regex-style-search=true
ignore=title-trailing-punctuation, T3, title-max-length, T1, body-hard-tab, B3, B1
# verbosity should be a value between 1 and 3, the commandline -v flags take precedence over this
verbosity = 3
# By default gitlint will ignore merge commits. Set to 'false' to disable.
ignore-merge-commits=false
ignore-revert-commits=false
ignore-fixup-commits=false
ignore-squash-commits=false
# Enable debug mode (prints more output). Disabled by default
debug = false
# Set the extra-path where gitlint will search for user defined rules
# See http://jorisroovers.github.io/gitlint/user_defined_rules for details
extra-path=scripts/gitlint
[title-max-length-no-revert]
# line-length=75
[body-min-line-count]
# min-line-count=1
[body-max-line-count]
# max-line-count=200
[title-starts-with-subsystem]
regex = ^(?!subsys:)(?!treewide:)(([^:]+):)(\s([^:]+):)*\s(.+)$
[title-must-not-contain-word]
# Comma-separated list of words that should not occur in the title. Matching is case
# insensitive. It's fine if the keyword occurs as part of a larger word (so "WIPING"
# will not cause a violation, but "WIP: my title" will).
words=wip
[title-match-regex]
# python like regex (https://docs.python.org/2/library/re.html) that the
# commit-msg title must be matched to.
# Note that the regex can conflict with other rules if not used correctly
# (e.g. title-must-not-contain-word).
#regex=^US[0-9]*
[max-line-length-with-exceptions]
# B1 = body-max-line-length
# line-length=75
[body-min-length]
min-length=3
[body-is-missing]
# Whether to ignore this rule on merge commits (which typically only have a title)
# default = True
ignore-merge-commits=false
[body-changed-file-mention]
# List of files that need to be explicitly mentioned in the body when they are changed
# This is useful for when developers often erroneously edit certain files or git submodules.
# By specifying this rule, developers can only change the file when they explicitly reference
# it in the commit message.
#files=gitlint/rules.py,README.md
[ignore-by-author-name]
regex=^dependabot\[bot\]$
ignore=all
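
As a quick illustration of the [title-starts-with-subsystem] rule above, the sketch below shows which commit titles its regex does and does not match. The sample titles are invented, and what gitlint does with a non-match is defined by the user-defined rule in scripts/gitlint/zephyr_commit_rules.py, not by this snippet.

import re

# Pattern copied verbatim from the [title-starts-with-subsystem] section above.
TITLE_RE = re.compile(r"^(?!subsys:)(?!treewide:)(([^:]+):)(\s([^:]+):)*\s(.+)$")

for title in (
    "drivers: spi: fix transfer length check",  # matches: area, sub-area, subject
    "kernel: add idle thread statistics",       # matches: single area prefix
    "fix a typo",                               # no match: missing "<area>: " prefix
    "subsys: misc cleanup",                     # no match: bare "subsys:" is excluded
):
    result = "match" if TITLE_RE.match(title) else "no match"
    print(f"{result:9} {title}")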

.gitreview

@@ -0,0 +1,4 @@
[gerrit]
host=gerrit.zephyrproject.org
port=29418
project=zephyr.git

.known-issues/README

@@ -0,0 +1,55 @@
This directory contains configuration files to ignore errors found in
the build and test process which are known to the developers and for
now can be safely ignored.
To use:
$ cd zephyr
$ make SOMETHING >& result
$ scripts/filter-known-issues.py result
It is included in the source tree so that anyone who has to submit
something that triggers a false-positive error can include the
corresponding "ignore me" file, properly documented.
Each file can contain one or more multiline Python regular expressions
(https://docs.python.org/2/library/re.html#regular-expression-syntax)
that match an error message. Multiple regular expressions are
separated by comment blocks (that start with #). Note that an empty
line is still considered part of the multiline regular expression.
For example
---beginning---
#
# This testcase always fails, pending fix ZEP-1234
#
.*/tests/kernel/grumpy .* FAIL
#
# Documentation issue, masks:
#
# /home/e/inaky/z/kernel.git/doc/api/io_interfaces.rst:28: WARNING: Invalid definition: Expected identifier in nested name. [error at 19]
# struct dev_config::@65 dev_config::bits
# -------------------^
#
^(?P<filename>.+/doc/api/io_interfaces.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^\s+struct dev_config::@[0-9]+ dev_config::bits.*
^\s+-+\^
---end---
Note you want to:
- use relative paths; instead of
/home/me/mydir/zephyr/something/somewhere.c you will want
^.*/something/somewhere.c (as they will depend on where it is being
built)
- Replace line numbers with [0-9]+, as they will change
- (?P<filename>[-._/\w]+/something/somewhere.c) saves the match on
that file path in a "variable" called 'filename' that later you can
match with (?P=filename) if you want to match multiple lines of the
same error message.
Regexps can get really twisted and interesting; they are
powerful, so start small :)
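
A minimal sketch of the filtering mechanism described above: read one ignore file, split it into multiline regular expressions at the comment blocks (empty lines stay inside the current pattern), and strip every matching region from the captured build or test output. This is only an illustration of the idea, not the actual scripts/filter-known-issues.py implementation; the file handling and the simple re.sub strategy are assumptions.

import re
import sys

def load_patterns(conf_path):
    # Comment blocks (lines starting with '#') separate the multiline regexes;
    # empty lines are kept as part of the pattern being collected.
    blocks, current = [], []
    for line in open(conf_path, encoding="utf-8"):
        if line.startswith("#"):
            if current:
                blocks.append("".join(current))
                current = []
        else:
            current.append(line)
    if current:
        blocks.append("".join(current))
    return [re.compile(block, re.MULTILINE) for block in blocks if block.strip()]

def filter_known_issues(text, patterns):
    # Drop every region of the output matched by a known-issue pattern.
    for pattern in patterns:
        text = pattern.sub("", text)
    return text

if __name__ == "__main__":
    # Hypothetical usage: python filter_sketch.py result .known-issues/doc.conf
    output = open(sys.argv[1], encoding="utf-8").read()
    compiled = load_patterns(sys.argv[2])
    sys.stdout.write(filter_known_issues(output, compiled))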


@@ -0,0 +1,38 @@
#
# Bluetooth unnamed struct definition
#
# FIXME: all these should match the relative filename
#
^(?P<filename>[-._/\w]+/doc/api/bluetooth.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]$
^[ \t]*$
^[ \t]*\^$
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]$
^[ \t]*$
^[ \t]*\^$
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]$
^.*bt_conn_info.__unnamed__.*$
^[- \t]*\^$
#
# bt_gatt_discover_params unnamed struct definition
#
^(?P<filename>[-._/\w]+/doc/api/bluetooth.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]
^.*bt_gatt_discover_params.__unnamed__.*
^[- \t]*\^
#
# Bluetooth GATT unnamed struct definition
#
^(?P<filename>[-._/\w]+/doc/api/bluetooth.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]
^.*bt_gatt_read_params.__unnamed__.*
^[- \t]*\^


@@ -0,0 +1,18 @@
#
# KERNELVERSION not being defined in local builds, kill that warning,
# can ignore it
#
^.*/Kconfig.zephyr:[0-9]+: warning: The symbol KERNELVERSION references the non-existent environment variable KERNELVERSION.*
#
# Documentation generation, early message
#
^cd .* && doxygen doc/doxygen.config
^srctree=.* SRCARCH=\w+ python scripts/genrest/genrest.py .*$
# This cuts the sphinx build line; has to be separate because in the
# middle, we have removed the KERNELVERSION one and a full regex won't match
^sphinx-build -t \w+ -b html -d [-._/\w]+ -q \. .*
#
# Documentation generation, footer message
#
^[ \t]*
^Build finished. The HTML pages are in [-._/\w]+


@@ -0,0 +1,12 @@
#
# Sensor value unnamed struct definition
#
^(?P<filename>[-._/\w]+/doc/api/io_interfaces.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]
^.*sensor_value.__unnamed__.*
^[- \t]*\^


@@ -0,0 +1,15 @@
#
# UART unnamed struct definition
#
^(?P<filename>[-._/\w]+/doc/api/io_interfaces.rst):(?P<lineno>[0-9]+): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected identifier in nested name. \[error at [0-9]+]
^[ \t]*
^[ \t]*\^
^(?P=filename):(?P=lineno): WARNING: Invalid definition: Expected end of definition. \[error at [0-9]+]
^.*uart_device_config.__unnamed__.*
^[- \t]*\^

.known-issues/make.conf

@@ -0,0 +1,6 @@
#
# When filtering output of the build process, ignore lines that don't
# provide any information that helps the invoker tell if there was an
# error.
#
^make: (Entering|Leaving) directory .*


@@ -0,0 +1,11 @@
#
# When executing test cases, ignore the following messages as they are
# not to be considered hard errors.
#
# Block line when test case cannot run in the HW due to server or connection issues
#
^BLCK0/[-a-z0-9:]+ (.+)#test @[^/]+/[^:]+:[^:]+: evaluation blocked(.*)$
#
# Block line when there is an issue with the YKUSH serial connection
#
^BLCK0/[-a-z0-9:]+ (.+)#test @[^/]+/(?P<board>[^:]+):[^:]+: exception: 400: (?P=board): Cannot find YKUSH serial '[A-Z0-9]+'$


@@ -0,0 +1,18 @@
#
# When executing test cases under TCF, ignore the following messages
# as they are not to be considered hard errors.
#
# TCF is run under make to take advantage of the jobserver; when
# the testcase execution fails, make will complain, which we can
# ignore ('sommersault' was the old name of the target).
#
^/tmp/tcf-[a-zA-Z0-9]+.mk:[0-9]+: recipe for target ('tcf-jobserver-run'|'sommersault') failed$
#
# More of the same
#
^make: \*\*\* \[(tcf-jobserver-run|sommersault)\] Error 1$
#
# TCF's summary line. We don't need to consider it to determine if the
# run failed or passed.
#
^[A-Z]+0/\S+:\s+\S+\s+@\S+: [0-9]+ tests \([0-9]+ passed, [0-9]+ failed, [0-9]+ blocked, [0-9]+ skipped\).*$


@@ -0,0 +1,4 @@
#
# Skip line when test case is eliminated due to filters
#
^SKIP0/\S+\s+\S+: No targets can be used \(all [0-9]+ selected from [0-9]+ available eliminated by testcase filtering\)$

.mailmap

@@ -1,124 +1,20 @@
Alexandr Kolosov <rikorsev@gmail.com>
Alexandre d'Alton <alexandre.dalton@intel.com>
Amir Kaplan <amir.kaplan@intel.com>
Anas Nashif <anas.nashif@intel.com>
Andrzej Kuroś <andrzej.kuros@nordicsemi.no>
Anthony Smigielski <thebasti0ncode@gmail.com>
Armand Ciejak <armand@riedonetworks.com> <armandciejak@users.noreply.github.com>
Aska Wu <aska.wu@linaro.org>
Bit Pathe <bitpathe@gmail.com>
Bjarki Arge Andreasen <baa@trackunit.com>
Carles Cufi <carles.cufi@nordicsemi.no>
chao an <anchao@xiaomi.com>
Charles E. Youse <charles.youse@intel.com>
Chen Xingyu <hi@xingrz.me>
Christoph Schnetzler <christoph.schnetzler@husqvarnagroup.com>
Christoph Schramm <schramm@makaio.com>
Christopher Friedt <cfriedt@meta.com>
Christopher Friedt <cfriedt@meta.com> <cfriedt@fb.com>
Chuck Jordan <cjordan@synopsys.com>
Chunlin Han <chunlin.han@linaro.org> <chunlin.han@acer.com>
David B. Kinder <david.b.kinder@intel.com>
David Komel <a8961713@gmail.com>
David Leach <david.leach@nxp.com>
Dirk Brandewie <dirk.j.brandewie@intel.com>
Douglas Su <d0u9.su@outlook.com>
Enjia Mai <enjia.mai@intel.com>
Enjia Mai <enjia.mai@intel.com>
Evan Couzens <evanx.couzens@intel.com>
Evgeniy Paltsev <PaltsevEvgeniy@gmail.com>
Evgeniy Paltsev <PaltsevEvgeniy@gmail.com> <Eugeniy.Paltsev@synopsys.com>
Felipe Neves <ryukokki.felipe@gmail.com>
Findlay Feng <i@fengch.me>
Flavio Arieta Netto <flavio@exati.com.br>
Francois Ramu <francois.ramu@st.com>
Gerardo Aceves <gerardo.aceves@intel.com>
Gregory Shue <gregory.shue@legrand.com>
Gregory Shue <gregory.shue@legrand.com> <gregory.shue@legrand.us>
HaiLong Yang <cameledyang@pm.me>
James Johnson <james.johnson672@t-mobile.com>
Jarno Lämsä <jarno.lamsa@nordicsemi.no>
Jędrzej Ciupis <jedrzej.ciupis@nordicsemi.no>
Jeremie Garcia <jeremie.garcia@intel.com>
Jim Benjamin Luther <jilu@oticon.com>
Johan Kruger <johan.kruger@windriver.com>
Johann Fischer <j.fischer@phytec.de>
Jørgen Kvalvaag <jorgen.kvalvaag@nordicsemi.no>
Juan Manuel Cruz Alcaraz <juan.m.cruz.alcaraz@intel.com>
Juan Solano <juanx.solano.menacho@intel.com>
Julien D'Ascenzio <julien.dascenzio@paratronic.fr>
Jun Li <jun.r.li@intel.com>
Justin Watson <jwatson5@gmail.com>
Kamil Sroka <kamil.sroka@nordicsemi.no>
Katarzyna Giądła <katarzyna.giadla@nordicsemi.no>
Keren Siman-Tov <keren.siman-tov@intel.com>
Krzysztof Chruściński <krzysztof.chruscinski@nordicsemi.no>
Kuo-Lang Tseng <kuo-lang.tseng@intel.com>
Lei Liu <lei.a.liu@intel.com>
Leona Cook <leonax.cook@intel.com>
Dirk Brandewie <dirk.j.brandewie@intel.com> <dirk.j.brandewie@intel.com>
Mike Hirst <michael.hirst@windriver.com> <michael.hirst@windriver.com>
Johan Kruger <johan.kruger@windriver.com> <johan.kruger@windriver.com>
Leona Cook <leonax.cook@intel.com> <lsc@hackeress.com>
Lixin Guo <lixinx.guo@intel.com>
Łukasz Mazur <lukasz.mazur@hidglobal.com>
Manuel Argüelles <manuel.arguelles@nxp.com>
Manuel Argüelles <manuel.arguelles@nxp.com> <manuel.arguelles@coredumplabs.com>
Manuel Argüelles <manuel.arguelles@nxp.com> <marguelles.dev@gmail.com>
Marc Herbert <marc.herbert@intel.com> <46978960+marc-hb@users.noreply.github.com>
Marin Jurjević <marin.jurjevic@hotmail.com>
Mariusz Ryndzionek <mariusz.ryndzionek@firmwave.com>
Mariusz Skamra <mariusz.skamra@codecoup.pl>
Mariusz Skamra <mariusz.skamra@codecoup.pl> <mariusz.skamra@tieto.com>
Martí Bolívar <marti.bolivar@nordicsemi.no>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti.bolivar@linaro.org>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti.f.bolivar@gmail.com>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti@foundries.io>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti@opensourcefoundries.com>
Martin Jäger <martin@libre.solar> <17674105+martinjaeger@users.noreply.github.com>
Mateusz Hołenko <mholenko@antmicro.com>
Michael Rosen <michael.r.rosen@intel.com>
Michal Narajowski <michal.narajowski@codecoup.pl>
Mike Hirst <michael.hirst@windriver.com>
Ming Shao <ming.shao@intel.com>
Mohan Kumar Kumar <mohankm@fb.com>
Naga Raja Rao Tulasi <tulasi.r@tcs.com>
Navin Sankar Velliangiri <navin@linumiz.com>
NingX Zhao <ningx.zhao@intel.com>
Nishikant Nayak <nishikantax.nayak@intel.com>
Ole Sæther <ole.saether@nordicsemi.no>
Pavel Král <pavel.kral@omsquare.com>
Pavel Vasilyev <pavel.vasilyev@nordicsemi.no>
Paweł Czarnecki <pczarnecki@antmicro.com>
Paweł Czarnecki <pczarnecki@antmicro.com>
Paweł Czarnecki <pczarnecki@antmicro.com> <pczarnecki@internships.antmicro.com>
Paweł Kwiek <pawel.kwiek@nordicsemi.no>
Peng Chen <peng1.chen@intel.com>
Peter Bigot <peter.bigot@nordicsemi.no>
Peter Bigot <peter.bigot@nordicsemi.no> <pab@pabigot.com>
Peter Johanson <peter@peterjohanson.com>
Piyush Itankar <piyush.t.itankar@intel.com>
Radosław Koppel <radoslaw.koppel@nordicsemi.no>
Radosław Koppel <radoslaw.koppel@nordicsemi.no> <r.koppel@k-el.com>
Raja D. Singh <rdsingh@iotwizards.com>
Ricardo Salveti <ricardo@opensourcefoundries.com>
Ricardo Salveti <ricardo@opensourcefoundries.com> <ricardo.salveti@linaro.org>
Ruud Derwig <Ruud.Derwig@synopsys.com>
Saku Rautio <saku.rautio@nordicsemi.no>
Scott Worley <scott.worley@microchip.com>
Sean Nyekjaer <sean@geanix.com> <sean.nyekjaer@prevas.dk>
Sean Nyekjaer <sean@geanix.com> <sean@nyekjaer.dk>
Sharron Liu <sharron.liu@intel.com>
Shilpashree L C <shilpashree.lc@intel.com>
Shuang He <shuang.he@intel.com>
Sigvart Hovland <sigvart.hovland@nordicsemi.no>
Stéphane D'Alu <sdalu@sdalu.com>
Stine Åkredalen <stine.akredalen@nordicsemi.no>
Thomas Heeley <thomas.heeley@intel.com>
Tim Sørensen <tims@demant.com>
Tim Sørensen <tims@demant.com> <tims@oticon.com>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no> <vich@nordicsemi.no>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no> <vinayak.kariappa@gmail.com>
Xiaorui Hu <xiaorui.hu@linaro.org>
Yannis Damigos <giannis.damigos@gmail.com> <ydamigos@iccs.gr>
Yonattan Louise <yonattan.a.louise.mendoza@intel.com>
Yonattan Louise <yonattan.a.louise.mendoza@intel.com> <yonattan.a.louise.mendoza@linux.intel.com>
YouhuaX Zhu <youhuax.zhu@intel.com>
Leona Cook <leonax.cook@intel.com> <leonax.cook@intel.com>
Thomas Heeley <thomas.heeley@intel.com> <thomas.heeley@intel.com>
Shuang He <shuang.he@intel.com> <shuang.he@intel.com>
Chuck Jordan <cjordan@synopsys.com> <cjordan@synopsys.com>
Jeremie Garcia <jeremie.garcia@intel.com> <jeremie.garcia@intel.com>
Bit Pathe <bitpathe@gmail.com> <bitpathe@gmail.com>
Carles Cufi <carles.cufi@nordicsemi.no> <carles.cufi@nordicsemi.no>
Kuo-Lang Tseng <kuo-lang.tseng@intel.com> <kuo-lang.tseng@intel.com>
Gerardo Aceves <gerardo.aceves@intel.com> <gerardo.aceves@intel.com>
Evan Couzens <evanx.couzens@intel.com> <evanx.couzens@intel.com>
Lei Liu <lei.a.liu@intel.com> <lei.a.liu@intel.com>
Douglas Su <d0u9.su@outlook.com> <d0u9.su@outlook.com>
Keren Siman-Tov <keren.siman-tov@intel.com> <keren.siman-tov@intel.com>
Naga Raja Rao Tulasi <tulasi.r@tcs.com> <tulasi.r@tcs.com>
Felipe Neves <ryukokki.felipe@gmail.com> <ryukokki.felipe@gmail.com>
Amir Kaplan <amir.kaplan@intel.com> <amir.kaplan@intel.com>

File diff suppressed because it is too large


@@ -1,30 +0,0 @@
# Copyright (c) 2024 Basalte bv
# SPDX-License-Identifier: Apache-2.0
extend = ".ruff-excludes.toml"
line-length = 100
target-version = "py310"
[lint]
select = [
# zephyr-keep-sorted-start
"B", # flake8-bugbear
"E", # pycodestyle
"F", # pyflakes
"I", # isort
"SIM", # flake8-simplify
"UP", # pyupgrade
"W", # pycodestyle warnings
# zephyr-keep-sorted-stop
]
ignore = [
# zephyr-keep-sorted-start
"SIM108", # Allow if-else blocks instead of forcing ternary operator
# zephyr-keep-sorted-stop
]
[format]
quote-style = "preserve"
line-ending = "lf"


@@ -1,16 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
extends: default
rules:
line-length:
max: 100
comments:
min-spaces-from-content: 1
indentation:
spaces: 2
indent-sequences: consistent
document-start:
present: false
truthy:
check-keys: false

File diff suppressed because it is too large


@@ -1,4 +0,0 @@
# Instead of the CODEOWNERS file, The Zephyr Project uses a custom format.
# See MAINTAINERS.yml for details.
#
# DO NOT EDIT THIS FILE.


@@ -1,139 +0,0 @@
# Zephyr Project Code of Conduct
## Our Pledge
We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, caste, color, religion, or sexual
identity and orientation.
We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.
## Our Standards
Examples of behavior that contributes to a positive environment for our
community include:
* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
and learning from the experience
* Focusing on what is best not just for us as individuals, but for the overall
community
Examples of unacceptable behavior include:
* The use of sexualized language or imagery, and sexual attention or advances of
any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email address,
without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Enforcement Responsibilities
Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.
Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, wiki edits, issues, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.
## Scope
This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
conduct@zephyrproject.org. Reports will be received by the Chair of the Zephyr
Governing Board, the Zephyr Project Director (Linux Foundation), and the Zephyr
Project Developer Advocate (Linux Foundation). You may refer to the [Governing
Board](https://zephyrproject.org/governing-board/) and [Linux Foundation
Staff](https://zephyrproject.org/staff/) web pages to identify who are the
individuals currently holding these positions.
All complaints will be reviewed and investigated promptly and fairly.
All community leaders are obligated to respect the privacy and security of the
reporter of any incident.
## Enforcement Guidelines
Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:
### 1. Correction
**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.
**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.
### 2. Warning
**Community Impact**: A violation through a single incident or series of
actions.
**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or permanent
ban.
### 3. Temporary Ban
**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.
**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.
### 4. Permanent Ban
**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior, harassment of an
individual, or aggression toward or disparagement of classes of individuals.
**Consequence**: A permanent ban from any sort of public interaction within the
community.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.1, available at
[https://www.contributor-covenant.org/version/2/1/code_of_conduct.html][v2.1].
The only changes made by The Zephyr Project to the original document were to
make explicit who the recipients of Code of Conduct incident reports are.
Community Impact Guidelines were inspired by
[Mozilla's code of conduct enforcement ladder][Mozilla CoC].
For answers to common questions about this code of conduct, see the FAQ at
[https://www.contributor-covenant.org/faq][FAQ]. Translations are available at
[https://www.contributor-covenant.org/translations][translations].
[homepage]: https://www.contributor-covenant.org
[v2.1]: https://www.contributor-covenant.org/version/2/1/code_of_conduct.html
[Mozilla CoC]: https://github.com/mozilla/diversity
[FAQ]: https://www.contributor-covenant.org/faq
[translations]: https://www.contributor-covenant.org/translations


@@ -1,39 +0,0 @@
Contribution Guidelines
#######################
As an open-source project, we welcome and encourage the community to submit
patches directly to the project. In our collaborative open source environment,
standards and methods for submitting changes help reduce the chaos that can result
from an active development community.
This document briefly summarizes the full `Contribution
Guidelines <http://docs.zephyrproject.org/latest/contribute/index.html>`_
documentation.
* Zephyr uses the permissive open source `Apache 2.0 license`_
that allows you to freely use, modify, distribute and sell your own products
that include Apache 2.0 licensed software.
* There are some imported or reused components of the Zephyr project that
use other licensing and are clearly identified.
* The Developer Certificate of Origin (DCO) process is followed to
ensure developers are following licensing criteria for their
contributions, and documented with a ``Signed-off-by`` line in commits.
* Zephyr development workflow is supported on Linux, macOS, and Windows
(with a few exceptions).
* Source code for the project is maintained in the GitHub repo:
https://github.com/zephyrproject-rtos/zephyr
* Issue and feature tracking is done using GitHub issues in this repo.
* A Continuous Integration (CI) system runs on every Pull Request (PR)
to verify several aspects of the PR including Git commit formatting,
Coding Style, sanity check builds, and documentation builds.
* The `Zephyr devel mailing list`_ is a great place to engage with the
community, ask questions, discuss issues, and help each other.
.. _Zephyr devel mailing list: https://lists.zephyrproject.org/g/devel

123
Kbuild Normal file

@@ -0,0 +1,123 @@
# vim: filetype=make
ifneq ("$(wildcard $(MDEF_FILE))","")
MDEF_FILE_PATH=$(strip $(MDEF_FILE))
else
ifneq ($(MDEF_FILE),)
MDEF_FILE_PATH=$(strip $(PROJECT_BASE)/$(MDEF_FILE))
endif
endif
ifeq (${CONFIG_NUM_COMMAND_PACKETS},)
CONFIG_NUM_COMMAND_PACKETS=0
endif
ifeq (${CONFIG_NUM_TIMER_PACKETS},)
CONFIG_NUM_TIMER_PACKETS=0
endif
ifeq (${CONFIG_NUM_TASK_PRIORITIES},)
CONFIG_NUM_TASK_PRIORITIES=$(CONFIG_NUM_PREEMPT_PRIORITIES)
endif
ifeq ($(ARCH),x86)
TASKGROUP_SSE=" TASKGROUP SSE"
endif
define filechk_prj.mdef
(echo "% WARNING. THIS FILE IS AUTO-GENERATED. DO NOT MODIFY!"; \
echo; \
echo "% CONFIG NUM_COMMAND_PACKETS NUM_TIMER_PACKETS NUM_TASK_PRIORITIES"; \
echo "% ============================================================="; \
echo " CONFIG ${CONFIG_NUM_COMMAND_PACKETS} ${CONFIG_NUM_TIMER_PACKETS} ${CONFIG_NUM_TASK_PRIORITIES}"; \
echo; \
echo "% TASKGROUP NAME";\
echo "% ==============";\
echo " TASKGROUP EXE";\
echo " TASKGROUP SYS";\
echo " TASKGROUP FPU_LEGACY";\
echo $(TASKGROUP_SSE);\
echo; \
if test -e "$(MDEF_FILE_PATH)"; then \
cat $(MDEF_FILE_PATH); \
fi;)
endef
misc/generated/sysgen/prj.mdef: $(MDEF_FILE_PATH) \
include/config/auto.conf FORCE
$(call filechk,prj.mdef)
sysgen_cmd=$(strip \
$(PYTHON) $(srctree)/scripts/sysgen \
-i $(CURDIR)/misc/generated/sysgen/prj.mdef \
-o $(CURDIR)/misc/generated/sysgen/ \
)
misc/generated/sysgen/kernel_main.c: misc/generated/sysgen/prj.mdef \
$(srctree)/scripts/sysgen
$(Q)$(sysgen_cmd)
define filechk_configs.c
(echo "/* file is auto-generated, do not modify ! */"; \
echo; \
echo "#include <toolchain.h>"; \
echo; \
echo "GEN_ABS_SYM_BEGIN (_ConfigAbsSyms)"; \
echo; \
cat $(CURDIR)/include/generated/autoconf.h | sed \
's/".*"/1/' | awk \
'/#define/{printf "GEN_ABSOLUTE_SYM(%s, %s);\n", $$2, $$3}'; \
echo; \
echo "GEN_ABS_SYM_END";)
endef
misc/generated/configs.c: include/config/auto.conf FORCE
$(call filechk,configs.c)
targets := misc/generated/configs.c
targets += include/generated/offsets.h
always := misc/generated/configs.c
always += include/generated/offsets.h
ifeq ($(CONFIG_MDEF),y)
targets += misc/generated/sysgen/kernel_main.c
always += misc/generated/sysgen/kernel_main.c
endif
define rule_cc_o_c_1
$(call echo-cmd,cc_o_c_1) $(cmd_cc_o_c_1);
endef
OFFSETS_INCLUDE = $(strip \
-include $(CURDIR)/include/generated/autoconf.h \
-I $(srctree)/include \
-I $(CURDIR)/include/generated \
-I $(srctree)/kernel/unified/include \
$(OFFSETS_INCLUDE_KERNEL_LOCATION) \
-I $(srctree)/lib/libc/minimal/include \
-I $(srctree)/arch/${ARCH}/include )
cmd_cc_o_c_1 = $(CC) $(KBUILD_CFLAGS) $(OFFSETS_INCLUDE) -c -o $@ $<
arch/$(ARCH)/core/offsets/offsets.o: arch/$(ARCH)/core/offsets/offsets.c $(KCONFIG_CONFIG)
$(Q)mkdir -p $(dir $@)
$(call if_changed,cc_o_c_1)
define offsetchk
$(Q)set -e; \
$(kecho) ' CHK $@'; \
mkdir -p $(dir $@); \
$(GENOFFSET_H) -i $(1) -o $@.tmp; \
if [ -r $@ ] && cmp -s $@ $@.tmp; then \
rm -f $@.tmp; \
else \
$(kecho) ' UPD $@'; \
mv -f $@.tmp $@; \
fi
endef
include/generated/offsets.h: arch/$(ARCH)/core/offsets/offsets.o \
include/config/auto.conf FORCE
$(call offsetchk,arch/$(ARCH)/core/offsets/offsets.o)
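The filechk_configs.c recipe above turns every #define in the generated autoconf.h into a GEN_ABSOLUTE_SYM() entry, with sed collapsing quoted string values to 1 before awk formats the symbol lines. A rough Python equivalent of that sed/awk pipeline is sketched below; the sample autoconf.h content is invented for illustration, and the real recipe also emits the GEN_ABS_SYM_BEGIN/GEN_ABS_SYM_END wrapper.

import re

# Sketch of the sed 's/".*"/1/' | awk '/#define/{printf ...}' pipeline.
autoconf_h = """\
#define CONFIG_BOARD "qemu_x86"
#define CONFIG_NUM_IRQS 32
#define CONFIG_MDEF 1
"""

for line in autoconf_h.splitlines():
    line = re.sub(r'".*"', "1", line)       # quoted values become plain 1
    fields = line.split()
    if line.startswith("#define") and len(fields) >= 3:
        print(f"GEN_ABSOLUTE_SYM({fields[1]}, {fields[2]});")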

18
Kconfig

@@ -1,8 +1,20 @@
# General configuration options
# Kconfig - general configuration options
#
# Copyright (c) 2014-2015 Wind River Systems, Inc.
# SPDX-License-Identifier: Apache-2.0
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
mainmenu "Zephyr Kernel Configuration"
source "Kconfig.zephyr"


@@ -1,19 +0,0 @@
# Constant variables to be used across Kconfig options
# Copyright (c) 2024 basalte bv
# SPDX-License-Identifier: Apache-2.0
INT8_MIN := -128
INT16_MIN := -32768
INT32_MIN := -2147483648
INT64_MIN := -9223372036854775808
INT8_MAX := 127
INT16_MAX := 32767
INT32_MAX := 2147483647
INT64_MAX := 9223372036854775807
UINT8_MAX := 255
UINT16_MAX := 65535
UINT32_MAX := 4294967295
UINT64_MAX := 18446744073709551615

File diff suppressed because it is too large


@@ -1,202 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.


@@ -1,10 +0,0 @@
> [!IMPORTANT]
> The license files in this directory are maintained as per the recommendation
> of the REUSE specification (https://reuse.software/spec-3.3).
> These are licenses that may apply to some components hosted in the source tree
> but that do not end up in built binaries
> (see https://docs.zephyrproject.org/latest/LICENSING.html#zephyr-licensing).
>
> These files do _not_ define the license of the Zephyr project itself.
> Zephyr is licensed under the Apache 2.0 license, as specified in
> the [LICENSE](/LICENSE) file at the root of the repository.

66
LICENSING.rst Normal file

@@ -0,0 +1,66 @@
.. _zephyr_licensing:
Licensing of Zephyr Project components
######################################
The Zephyr kernel tree imports or reuses packages, scripts and other
files that are not covered by the :download:`Apache License
<../LICENSE>`. In some places there is no LICENSE file or way to put
a LICENSE file there, so we describe the licensing in this document.
- *kconfig* and *kbuild*
*Origin:* Linux Kernel
*Licensing:* *GPLv2*
- *scripts/{checkpatch.pl,checkstack.pl,get_maintainers.pl,spelling.txt}*
*Origin:* Linux Kernel
*Licensing:* *GPLv2*
- *ext/fs/fat/*
*Origin:* FatFs is a file system based on the FAT file system specification. This is
provided by ELM Chan http://elm-chan.org/fsw/ff/00index_e.html
*Licensing*:
Copyright (C) 2016, ChaN, all right reserved.
FatFs module is an open source software. Redistribution and use of FatFs in
source and binary forms, with or without modification, are permitted provided
that the following condition is met:
1. Redistributions of source code must retain the above copyright notice,
this condition and the following disclaimer.
This software is provided by the copyright holder and contributors "AS IS"
and any warranties related to this software are DISCLAIMED.
The copyright owner or contributors be NOT LIABLE for any damages caused
by use of this software.
- *ext/hal/cmsis/*
*Origin:* https://github.com/ARM-software/CMSIS.git
*Licensing*: :download:`CMSIS_END_USER_LICENCE_AGREEMENT <../ext/hal/cmsis/CMSIS_END_USER_LICENCE_AGREEMENT.pdf>`
- *ext/hal/ksdk/*
*Origin:* http://kex.nxp.com
*Licensing*: 3-clause BSD (see :download:`source
<../ext/hal/ksdk/drivers/fsl_rtc.h>`)
- *ext/hal/nordic/*
*Origin:*
*Licensing*: 3-clause BSD (see :download:`source <../ext/hal/nordic/mdk/nrf51.h>`)
- *ext/hal/qmsi/*
*Origin:* https://github.com/quark-mcu/qmsi/releases
*Licensing*: 3-clause BSD (see :download:`source <../ext/hal/qmsi/include/qm_common.h>`)

451
MAINTAINERS Normal file

@@ -0,0 +1,451 @@
Originally from the Linux Kernel.
# Licensed under the terms of the GNU GPL License version 2
Descriptions of section entries:
P: Person (obsolete)
M: Mail patches to: FullName <address@domain>
R: Designated reviewer: FullName <address@domain>
These reviewers should be CCed on patches.
L: Mailing list that is relevant to this area
W: Web-page with status/info
Q: Patchwork web based patch tracking system site
T: SCM tree type and location.
Type is one of: git, hg, quilt, stgit, topgit
S: Status, one of the following:
Supported: Someone is actually paid to look after this.
Maintained: Someone actually looks after it.
Odd Fixes: It has a maintainer but they don't have time to do
much other than throw the odd patch in. See below..
Orphan: No current maintainer [but maybe you could take the
role as you write your new code].
Obsolete: Old code. Something tagged obsolete generally means
it has been replaced by a better system and you
should be using that.
F: Files and directories with wildcard patterns.
A trailing slash includes all files and subdirectory files.
F: drivers/net/ all files in and below drivers/net
F: drivers/net/* all files in drivers/net, but not below
F: */net/* all files in "any top level directory"/net
One pattern per line. Multiple F: lines acceptable.
N: Files and directories with regex patterns.
N: [^a-z]tegra all files whose path contains the word tegra
One pattern per line. Multiple N: lines acceptable.
scripts/get_maintainer.pl has different behavior for files that
match F: pattern and matches of N: patterns. By default,
get_maintainer will not look at git log history when an F: pattern
match occurs. When an N: match occurs, git log history is used
to also notify the people that have git commit signatures.
X: Files and directories that are NOT maintained, same rules as F:
Files exclusions are tested before file matches.
Can be useful for excluding a specific subdirectory, for instance:
F: net/
X: net/ipv6/
matches all files in and below net excluding net/ipv6/
K: Keyword perl extended regex pattern to match content in a
patch or file. For instance:
K: of_get_profile
matches patches or files that contain "of_get_profile"
K: \b(printk|pr_(info|err))\b
matches patches or files that contain one or more of the words
printk, pr_info or pr_err
One regex pattern per line. Multiple K: lines acceptable.
Note: For the hard of thinking, this list is meant to remain in alphabetical
order. If you could add yourselves to it in alphabetical order that would be
so much easier [Ed]
Maintainers List (try to look for most precise areas first)
-----------------------------------
ARC ARCHITECTURE
M: Ruud Derwig <Ruud.Derwig@synopsys.com>
M: Chuck Jordan <Chuck.Jordan@synopsys.com>
M: Benjamin Walsh <benjamin.walsh@windriver.com>
S: Supported
F: arch/arc/
F: include/arch/arc/
F: boards/arc/
ARM ARCHITECTURE
M: Maureen Helm <maureen.helm@nxp.com>
M: Kumar Gala <kumar.gala@linaro.org>
S: Supported
F: arch/arm/
F: include/arch/arm/
F: boards/arm/
ARM CORTEX MICROCONTROLLER SOFTWARE INTERFACE STANDARD (CMSIS)
M: Maureen Helm <maureen.helm@nxp.com>
M: Kumar Gala <kumar.gala@linaro.org>
S: Supported
F: ext/hal/cmsis/
BOARDS/ARC - ARDUINO 101 SSS
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: boards/arc/arduino_101_sss/
BOARDS/ARC - EM Starterkit
M: Chuck Jordan <Chuck.Jordan@synopsys.com>
S: Supported
F: boards/arc/em_starterkit/
BOARDS/ARC - QUARK SE C1000 SS Devboard
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: boards/arc/quark_se_c1000_ss_devboard/
BOARDS/ARM - 96Boards CARBON
M: Amit Kucheria <amit.kucheria@linaro.org>
M: Ricardo Salveti <ricardo.salveti@linaro.org>
S: Supported
F: boards/arm/96b_carbon/
BOARDS/ARM - 96Boards NITROGEN
M: Amit Kucheria <amit.kucheria@linaro.org>
S: Supported
F: boards/arm/96b_nitrogen/
BOARDS/ARM - ARDUINO 101 BLE
M: Johan Hedberg <johan.hedberg@intel.com>
S: Supported
F: boards/arm/arduino_101_ble/
BOARDS/ARM - CC3200 LAUNCHXL
M: Gil Pitney <gil.pitney@linaro.org>
S: Supported
F: boards/arm/cc3200_launchxl/
BOARDS/ARM - NXP FRDM-K64F
M: Maureen Helm <maureen.helm@nxp.com>
S: Supported
F: boards/arm/frdm_k64f/
BOARDS/ARM - NXP Hexiwear
M: Maureen Helm <maureen.helm@nxp.com>
S: Supported
F: boards/arm/hexiwear_k64/
BOARDS/ARM - NORDIC NRF51 REDBEAR BLENANO
M: Ricardo Salveti <ricardo.salveti@linaro.org>
S: Supported
F: boards/arm/nrf51_blenano/
BOARDS/ARM - NORDIC NRF52 PCA10040
M: Carles Cufi <carles.cufi@nordicsemi.no>
S: Supported
F: boards/arm/nrf52_pca10040/
BOARDS/ARM - NUCLEO-64 F401RE Devboard
M: Amit Kucheria <amit.kucheria@linaro.org>
M: Ricardo Salveti <ricardo.salveti@linaro.org>
S: Supported
F: boards/arm/nucleo_f401re/
BOARDS/ARM - ARM LTD V2M Beetle
M: Vincenzo Frascino <vincenzo.frascino@linaro.org>
S: Supported
F: boards/arm/v2m_beetle/
BOARDS/NIOS2 - ALTERA MAX10
M: Andrew Boie <andrew.p.boie@intel.com>
S: Supported
F: boards/nios2/altera_max10/
BOARDS/X86 - ARDUINO 101
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: boards/x86/arduino_101/
BOARDS/X86 - Galileo
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: boards/x86/galileo/
BOARDS/X86 - QUARK D2000 Devboard
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: boards/x86/quark_d2000/
BOARDS/X86 - QUARK SE C1000 Devboard
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: boards/x86/quark_se_c1000/
BLUETOOTH
M: Johan Hedberg <johan.hedberg@intel.com>
M: Luiz Augusto von Dentz <luiz.dentz@gmail.com>
M: Szymon Janc <szymon.janc@gmail.com>
S: Supported
W: https://www.zephyrproject.org/doc/subsystems/bluetooth/bluetooth.html
F: subsys/bluetooth/
F: include/bluetooth/
F: drivers/bluetooth/
F: samples/bluetooth/
F: tests/bluetooth/
F: doc/subsystems/bluetooth/
BLUETOOTH CONTROLLER
M: Vinayak Chettimada <vinayak.kariappa.chettimada@nordicsemi.no>
M: Carles Cufi <carles.cufi@nordicsemi.no>
S: Supported
F: subsys/bluetooth/controller/
CC3200 SDK
M: Gil Pitney <gil.pitney@linaro.org>
S: Supported
F: ext/hal/cc3200sdk/
CC32XX SOC - TI SIMPLELINK
M: Gil Pitney <gil.pitney@linaro.org>
S: Supported
F: arch/arm/soc/ti_simplelink/
DOCUMENTATION
M: Kinder, David <david.b.kinder@intel.com>
M: Perez-Gonzalez, Inaky <inaky.perez-gonzalez@intel.com>
S: Supported
F: doc/
FILE SYSTEM
M: Ramesh Thomas <ramesh.thomas@intel.com>
M: Kuo-Lang Tseng <kuo-lang.tseng@intel.com>
S: Supported
F: ext/fs/
F: subsys/fs/
F: include/fs/
F: include/fs.h
F: samples/fs/
FLASH DRIVER
M: Baohong Liu <baohong.liu@intel.com>
M: Kuo-Lang Tseng <kuo-lang.tseng@intel.com>
S: Supported
F: drivers/flash/
INTERRUPTS
M: Andrew Boie <andrew.p.boie@intel.com>
S: Supported
F: drivers/interrupt_controller/
F: arch/arc/core/
F: arch/arm/core/
F: arch/nios2/core/
F: arch/x86/core/
F: include/irq.h
F: include/arch/x86/arch.h
F: include/arch/arm/cortex_m/irq.h
F: include/arch/nios2/arch.h
F: include/arch/arc/arch.h
F: include/arch/arc/v2/irq.h
F: include/drivers/loapic.h
F: include/drivers/ioapic.h
F: include/drivers/mvic.h
KERNEL CORE
M: Benjamin Walsh <benjamin.walsh@windriver.com>
M: Allan Stephens <allan.stephens@windriver.com>
S: Supported
F: kernel/
F: include/nanokernel.h
F: include/microkernel.h
F: include/microkernel/
F: include/misc/
F: include/toolchain/
F: include/atomic.h
F: include/cache.h
F: include/init.h
F: include/irq.h
F: include/irq_offload.h
F: include/kernel_version.h
F: include/linker-defs.h
F: include/linker-tool-gcc.h
F: include/linker-tool.h
F: include/section_tags.h
F: include/sections.h
F: include/shared_irq.h
F: include/sw_isr_table.h
F: include/sys_clock.h
F: include/sys_io.h
F: include/toolchain.h
F: include/zephyr.h
KINETIS SOFTWARE DEVELOPMENT KIT (KSDK)
M: Maureen Helm <maureen.helm@nxp.com>
S: Supported
F: ext/hal/ksdk/
KNOWN ISSUES
M: Anas Nashif <anas.nashif@intel.com>
M: Inaky Perez-Gonzalez <inaky.perez-gonzalez@intel.com>
M: Javier B Perez <javier.b.perez.hernandez@intel.com>
F: .known-issues/
MAINTAINERS
M: Javier B Perez <javier.b.perez.hernandez@intel.com>
M: Anas Nashif <anas.nashif@intel.com>
M: Perez-Gonzalez, Inaky <inaky.perez-gonzalez@intel.com>
S: Supported
F: MAINTAINERS
MBEDTLS
M: Sergio Rodriguez <sergio.sf.rodriguez@intel.com>
M: Jithu Joseph <jithu.joseph@intel.com>
M: Kuo-Lang Tseng <kuo-lang.tseng@intel.com>
S: Supported
F: ext/lib/crypto/mbedtls/
F: samples/net/mbedtls_sslclient/
F: tests/crypto/test_mbedtls/
NETWORKING
M: Jukka Rissanen <jukka.rissanen@linux.intel.com>
S: Supported
W: https://www.zephyrproject.org/doc/subsystems/networking/networking.html
F: net/ip/
F: include/net/
F: samples/net/
F: tests/net/
NETWORK APPLICATIONS
M: Flavio Santes <flavio.santes@intel.com>
S: Supported
F: samples/net/dns_client/
F: samples/net/nats_clients/
F: samples/net/paho_mqtt_clients/
NETWORK BUFFERS
M: Johan Hedberg <johan.hedberg@intel.com>
M: Jukka Rissanen <jukka.rissanen@linux.intel.com>
S: Supported
W: https://www.zephyrproject.org/doc/subsystems/networking/buffers.html
F: net/buf.c
F: include/net/buf.h
F: tests/net/buf/
NIOS II
M: Andrew Boie <andrew.p.boie@intel.com>
S: Supported
F: arch/nios2/
F: include/arch/nios2/
F: drivers/serial/uart_altera_jtag.c
F: drivers/timer/altera_avalon_timer.c
F: tests/kernel/test_intmath/
F: boards/nios2/
NORDIC MDK
M: Carles Cufi <carles.cufi@nordicsemi.no>
S: Supported
F: ext/hal/nordic/mdk/
POWER MANAGEMENT
M: Ramesh Thomas <ramesh.thomas@intel.com>
M: Kuo-Lang Tseng <kuo-lang.tseng@intel.com>
S: Supported
F: arch/x86/core/crt0.S
F: include/device.h
F: include/init.h
F: include/power.h
F: kernel/microkernel/k_idle.c
F: kernel/nanokernel/device.c
F: samples/power/
QMSI
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: ext/hal/qmsi/
QMSI DRIVERS
M: Sergio Rodriguez <sergio.sf.rodriguez@intel.com>
M: Baohong Liu <baohong.liu@intel.com>
M: Kuo-Lang Tseng <kuo-lang.tseng@intel.com>
S: Supported
F: drivers/*/*qmsi*
F: drivers/*/*/*qmsi*
QUARK D2000 SOC
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: arch/x86/soc/intel_quark/quark_d2000/
QUARK SE C1000 SOC
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: arch/x86/soc/intel_quark/quark_se_c1000/
QUARK X1000 SOC
M: Anas Nashif <anas.nashif@intel.com>
S: Supported
F: arch/x86/soc/intel_quark/quark_x1000/
SANITYCHECK
M: Andrew Boie <andrew.p.boie@intel.com>
S: Supported
F: scripts/sanitycheck
F: scripts/expr_parser.py
F: scripts/sanity_chk/
SENSOR DRIVERS
M: Bogdan Davidoaia <bogdan.m.davidoaia@intel.com>
M: Laurentiu Palcu <laurentiu.palcu@intel.com>
M: Murtaza Alexandru <alexandru.murtaza@intel.com>
M: Vlad Dogaru <vlad.dogaru@intel.com>
S: Supported
W: https://www.zephyrproject.org/doc/subsystems/sensor.html
F: include/sensor.h
F: drivers/sensor/
F: samples/sensor/
STM32CUBE SDK
M: Erwan Gouriou <erwan.gouriou@linaro.org>
S: Supported
F: ext/hal/st/stm32cube/
STM32F4X SoC FAMILY and DRIVERS
M: Amit Kucheria <amit.kucheria@linaro.org>
M: Ricardo Salveti <ricardo.salveti@linaro.org>
S: Supported
F: arch/arm/soc/st_stm32/stm32f4/
F: drivers/pinmux/stm32/
F: drivers/gpio/*stm32*
F: drivers/clock_control/*stm32f4*
TINYCRYPT
M: Constanza Heath <constanza.m.heath@intel.com>
M: Flavio Santes <flavio.santes@intel.com>
S: Supported
F: ext/lib/crypto/tinycrypt/
F: tests/crypto/
USB
M: Jithu Joseph <jithu.joseph@intel.com>
S: Supported
F: subsys/usb
F: drivers/usb
F: samples/usb
X86 ARCH
M: Benjamin Walsh <benjamin.walsh@windriver.com>
M: Allan Stephens <allan.stephens@windriver.com>
S: Supported
F: arch/x86/
F: include/arch/x86/
F: boards/x86/
ZOAP
M: Vinicius Costa Gomes <vinicius.gomes@intel.com>
S: Supported
F: lib/iot/zoap/
F: samples/net/zoap_client/
F: samples/net/zoap_server/
F: tests/net/zoap/
THE REST
M: Anas Nashif <anas.nashif@intel.com>
M: Kumar Gala <kumar.gala@linaro.org>
L: devel@lists.zephyrproject.com
T: git https://gerrit.zephyrproject.org/r/a/zephyr
S: Buried alive in reporters
F: *
F: */
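The F: rules described at the top of this file (a trailing slash covers everything below the directory, a wildcard stays within the listed directory, anything else is an exact file) lend themselves to a small lookup. The sketch below is only an approximation of those rules, using an invented two-section excerpt; get_maintainer.pl, mentioned above, is the real tool.

import fnmatch

MAINTAINERS_EXCERPT = """\
BLUETOOTH
M: Johan Hedberg <johan.hedberg@intel.com>
S: Supported
F: subsys/bluetooth/
F: include/bluetooth/

NIOS II
M: Andrew Boie <andrew.p.boie@intel.com>
S: Supported
F: arch/nios2/
F: drivers/serial/uart_altera_jtag.c
"""

def f_matches(pattern, path):
    if pattern.endswith("/"):              # all files in and below the directory
        return path.startswith(pattern)
    if "*" in pattern:                     # wildcard: same directory only, not below
        prefix = pattern[: pattern.index("*")]
        return fnmatch.fnmatch(path, pattern) and "/" not in path[len(prefix):]
    return path == pattern                 # plain entry: exact file

def maintainers_for(path):
    emails, hits = [], []
    for line in MAINTAINERS_EXCERPT.splitlines():
        if line and not line.startswith(("M:", "S:", "F:")):
            emails = []                    # a new section heading starts here
        elif line.startswith("M:"):
            emails.append(line[2:].strip())
        elif line.startswith("F:") and f_matches(line[2:].strip(), path):
            hits.extend(emails)
    return hits

print(maintainers_for("subsys/bluetooth/host/hci_core.c"))
# ['Johan Hedberg <johan.hedberg@intel.com>']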

File diff suppressed because it is too large

1340
Makefile Normal file

File diff suppressed because it is too large

149
Makefile.inc Normal file

@@ -0,0 +1,149 @@
# vim: filetype=make
#
UNAME := $(shell uname)
ifeq (MINGW, $(findstring MINGW, $(UNAME)))
DQUOTE = '
# '
PROJECT_BASE ?= $(shell sh -c "pwd -W")
else
DQUOTE = "
# "
PROJECT_BASE ?= $(CURDIR)
endif
ifdef BOARD
KBUILD_DEFCONFIG_PATH=$(wildcard $(ZEPHYR_BASE)/boards/*/*/$(BOARD)_defconfig)
ifeq ($(KBUILD_DEFCONFIG_PATH),)
$(error Board $(BOARD) not found!)
endif
else
$(error BOARD is not defined!)
endif
# Choose a default output directory if one wasn't supplied. Note that
# PRISTINE_O depends on whether this is default or not. If building
# in-tree, we want to remove the whole outdir and not just the BOARD
# specified (thus "pristine"). Out of tree, we can obviously remove
# only what we were told to build.
ifndef O
PRISTINE_O = outdir
O = $(PROJECT_BASE)/outdir/$(BOARD)
else
PRISTINE_O = $(O)
endif
# Turn O into an absolute path; we call the main Kbuild with $(MAKE) -C
# which changes the working directory, so relative paths don't work right.
# Need to create the directory first to make realpath happy
ifneq ($(MAKECMDGOALS),help)
$(shell mkdir -p $(O))
override O := $(realpath $(O))
endif
export ARCH MDEF_FILE QEMU_EXTRA_FLAGS PROJECT_BASE
override CONF_FILE := $(strip $(subst $(DQUOTE),,$(CONF_FILE)))
SOURCE_DIR ?= $(PROJECT_BASE)/src/
override SOURCE_DIR := $(realpath $(SOURCE_DIR))
override SOURCE_DIR := $(subst \,/,$(SOURCE_DIR))
override SOURCE_DIR_PARENT := $(patsubst %, %/.., $(SOURCE_DIR))
override SOURCE_DIR_PARENT := $(abspath $(SOURCE_DIR_PARENT))
override SOURCE_DIR_PARENT := $(subst \,/,$(SOURCE_DIR_PARENT))
export SOURCE_DIR SOURCE_DIR_PARENT
ifeq ("$(origin V)", "command line")
KBUILD_VERBOSE = $(V)
endif
ifndef KBUILD_VERBOSE
KBUILD_VERBOSE = 0
endif
ifeq ($(KBUILD_VERBOSE),1)
Q =
S =
else
Q = @
S = -s
endif
export CFLAGS
zephyrmake = +$(MAKE) -C $(ZEPHYR_BASE) O=$(1) \
PROJECT=$(PROJECT_BASE) SOURCE_DIR=$(DQUOTE)$(SOURCE_DIR)$(DQUOTE) $(2)
BOARDCONFIG = $(O)/.board_$(BOARD)
DOTCONFIG = $(O)/.config
all: $(DOTCONFIG)
$(Q)$(call zephyrmake,$(O),$@)
ifeq ($(findstring qemu_,$(BOARD)),)
qemu:
@echo "Emulation not available for this platform"
qemugdb: qemu
else
qemu: $(DOTCONFIG)
$(Q)$(call zephyrmake,$(O),$@)
qemugdb: $(DOTCONFIG)
$(Q)$(call zephyrmake,$(O),$@)
endif
debug: $(DOTCONFIG)
$(Q)$(call zephyrmake,$(O),$@)
flash: $(DOTCONFIG)
$(Q)$(call zephyrmake,$(O),$@)
ifeq ($(MAKECMDGOALS),debugserver)
ARCH = $(notdir $(subst /$(BOARD),,$(wildcard $(ZEPHYR_BASE)/boards/*/$(BOARD))))
-include $(ZEPHYR_BASE)/boards/$(ARCH)/$(BOARD)/Makefile.board
-include $(ZEPHYR_BASE)/scripts/Makefile.toolchain.$(ZEPHYR_GCC_VARIANT)
BOARD_NAME = $(BOARD)
export BOARD_NAME
endif
debugserver: FORCE
$(Q)$(CONFIG_SHELL) $(ZEPHYR_BASE)/scripts/support/$(FLASH_SCRIPT) debugserver
initconfig outputexports: $(DOTCONFIG)
$(BOARDCONFIG):
@rm -f $(O)/.board_*
@touch $@
ram_report: initconfig
$(Q)$(call zephyrmake,$(O),$@)
rom_report: initconfig
$(Q)$(call zephyrmake,$(O),$@)
menuconfig: initconfig
$(Q)$(call zephyrmake,$(O),$@)
help:
$(Q)$(MAKE) -s -C $(ZEPHYR_BASE) $@
# Catch all
%:
$(Q)$(call zephyrmake,$(O),$@)
KERNEL_CONFIG = $(ZEPHYR_BASE)/kernel/configs/unified.config
$(DOTCONFIG): $(BOARDCONFIG) $(KBUILD_DEFCONFIG_PATH) $(CONF_FILE)
$(Q)$(CONFIG_SHELL) $(ZEPHYR_BASE)/scripts/kconfig/merge_config.sh \
-q -m -O $(O) $(KBUILD_DEFCONFIG_PATH) $(KERNEL_CONFIG) $(CONF_FILE) \
$(wildcard $(O)/*.conf)
$(Q)$(MAKE) $(S) -C $(ZEPHYR_BASE) O=$(O) PROJECT=$(PROJECT_BASE) oldnoconfig
pristine:
$(Q)rm -rf $(PRISTINE_O)
PHONY += FORCE initconfig
FORCE:
.PHONY: $(PHONY)
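For the debugserver target above, ARCH is derived purely from the directory layout: the board is located under boards/<arch>/<board>/ and the architecture component of that path is extracted with make's wildcard/subst/notdir functions. The snippet below sketches the same lookup in Python; ZEPHYR_BASE and BOARD are placeholder values, so outside a real source tree it simply reports the board as not found.

import glob
import os

ZEPHYR_BASE = "/work/zephyr"   # placeholder, not a real checkout
BOARD = "frdm_k64f"

matches = glob.glob(os.path.join(ZEPHYR_BASE, "boards", "*", BOARD))
if matches:
    # .../boards/arm/frdm_k64f -> "arm", like $(notdir $(subst /$(BOARD),,...))
    print(os.path.basename(os.path.dirname(matches[0])))
else:
    print(f"Board {BOARD} not found!")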


@@ -1,109 +0,0 @@
.. raw:: html
<a href="https://www.zephyrproject.org">
<p align="center">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="doc/_static/images/logo-readme-dark.svg">
<source media="(prefers-color-scheme: light)" srcset="doc/_static/images/logo-readme-light.svg">
<img src="doc/_static/images/logo-readme-light.svg">
</picture>
</p>
</a>
<a href="https://bestpractices.coreinfrastructure.org/projects/74"><img src="https://bestpractices.coreinfrastructure.org/projects/74/badge"></a>
<a href="https://scorecard.dev/viewer/?uri=github.com/zephyrproject-rtos/zephyr"><img src="https://api.securityscorecards.dev/projects/github.com/zephyrproject-rtos/zephyr/badge"></a>
<a href="https://github.com/zephyrproject-rtos/zephyr/actions/workflows/twister.yaml?query=branch%3Amain"><img src="https://github.com/zephyrproject-rtos/zephyr/actions/workflows/twister.yaml/badge.svg?event=push"></a>
The Zephyr Project is a scalable real-time operating system (RTOS) supporting
multiple hardware architectures, optimized for resource constrained devices,
and built with security in mind.
The Zephyr OS is based on a small-footprint kernel designed for use on
resource-constrained systems: from simple embedded environmental sensors and
LED wearables to sophisticated smart watches and IoT wireless gateways.
The Zephyr kernel supports multiple architectures, including ARM (Cortex-A,
Cortex-R, Cortex-M), Intel x86, ARC, Tensilica Xtensa, and RISC-V,
SPARC, MIPS, and a large number of `supported boards`_.
.. below included in doc/introduction/introduction.rst
Getting Started
***************
Welcome to Zephyr! See the `Introduction to Zephyr`_ for a high-level overview,
and the documentation's `Getting Started Guide`_ to start developing.
.. start_include_here
Community Support
*****************
Community support is provided via mailing lists and Discord; see the Resources
below for details.
.. _project-resources:
Resources
*********
Here's a quick summary of resources to help you find your way around:
Getting Started
---------------
| 📖 `Zephyr Documentation`_
| 🚀 `Getting Started Guide`_
| 🙋🏽 `Tips when asking for help`_
| 💻 `Code samples`_
Code and Development
--------------------
| 🌐 `Source Code Repository`_
| 📦 `Releases`_
| 🤝 `Contribution Guide`_
Community and Support
---------------------
| 💬 `Discord Server`_ for real-time community discussions
| 📧 `User mailing list (users@lists.zephyrproject.org)`_
| 📧 `Developer mailing list (devel@lists.zephyrproject.org)`_
| 📬 `Other project mailing lists`_
| 📚 `Project Wiki`_
Issue Tracking and Security
---------------------------
| 🐛 `GitHub Issues`_
| 🔒 `Security documentation`_
| 🛡️ `Security Advisories Repository`_
| ⚠️ Report security vulnerabilities at vulnerabilities@zephyrproject.org
Additional Resources
--------------------
| 🌐 `Zephyr Project Website`_
| 📺 `Zephyr Tech Talks`_
.. _Zephyr Project Website: https://www.zephyrproject.org
.. _Discord Server: https://chat.zephyrproject.org
.. _supported boards: https://docs.zephyrproject.org/latest/boards/index.html
.. _Zephyr Documentation: https://docs.zephyrproject.org
.. _Introduction to Zephyr: https://docs.zephyrproject.org/latest/introduction/index.html
.. _Getting Started Guide: https://docs.zephyrproject.org/latest/develop/getting_started/index.html
.. _Contribution Guide: https://docs.zephyrproject.org/latest/contribute/index.html
.. _Source Code Repository: https://github.com/zephyrproject-rtos/zephyr
.. _GitHub Issues: https://github.com/zephyrproject-rtos/zephyr/issues
.. _Releases: https://github.com/zephyrproject-rtos/zephyr/releases
.. _Project Wiki: https://github.com/zephyrproject-rtos/zephyr/wiki
.. _User mailing list (users@lists.zephyrproject.org): https://lists.zephyrproject.org/g/users
.. _Developer mailing list (devel@lists.zephyrproject.org): https://lists.zephyrproject.org/g/devel
.. _Other project mailing lists: https://lists.zephyrproject.org/g/main/subgroups
.. _Code samples: https://docs.zephyrproject.org/latest/samples/index.html
.. _Security documentation: https://docs.zephyrproject.org/latest/security/index.html
.. _Security Advisories Repository: https://github.com/zephyrproject-rtos/zephyr/security
.. _Tips when asking for help: https://docs.zephyrproject.org/latest/develop/getting_started/index.html#asking-for-help
.. _Zephyr Tech Talks: https://www.zephyrproject.org/tech-talks


@@ -1,25 +0,0 @@
version = 1
# Declare default license and copyright text for files that typically do not include them.
[[annotations]]
path = [
# zephyr-keep-sorted-start
"**/*.cfg",
"**/*.cmake",
"**/*.conf",
"**/*.ecl",
"**/*.html",
"**/*.rst",
"**/*.yaml",
"**/*.yml",
"**/*_defconfig",
"**/CMakeLists.txt",
"**/Kconfig*",
"doc/requirements.in",
"doc/requirements.txt",
"scripts/requirements-*.in",
"scripts/requirements-*.txt",
# zephyr-keep-sorted-stop
]
SPDX-License-Identifier = "Apache-2.0"
SPDX-FileCopyrightText = "Copyright The Zephyr Project Contributors"


@@ -1 +0,0 @@
0.17.4


@@ -1,5 +0,0 @@
VERSION_MAJOR = 4
VERSION_MINOR = 3
PATCHLEVEL = 99
VERSION_TWEAK = 0
EXTRAVERSION =


@@ -1,14 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
# FIXME: SHADOW_VARS: Remove this once we have enabled -Wshadow globally.
add_compile_options($<TARGET_PROPERTY:compiler,warning_shadow_variables>)
add_definitions(-D__ZEPHYR_SUPERVISOR__)
include_directories(
${ZEPHYR_BASE}/kernel/include
${ARCH_DIR}/${ARCH}/include
)
add_subdirectory(common)
add_subdirectory(${ARCH_DIR}/${ARCH} arch/${ARCH})

File diff suppressed because it is too large

1
arch/Makefile Normal file

@@ -0,0 +1 @@
obj-y += $(ARCH)/


@@ -1,47 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
# Enable debug support in mdb
# Dwarf version 2 can be recognized by mdb
# The default dwarf version in gdb is not recognized by mdb
zephyr_cc_option(-g3 -gdwarf-2)
# Without this (poorly named) option, compiler may generate undefined
# references to abort().
# See https://gcc.gnu.org/bugzilla/show_bug.cgi?id=63691
zephyr_cc_option(-fno-delete-null-pointer-checks)
zephyr_cc_option_ifdef(CONFIG_ARC_USE_UNALIGNED_MEM_ACCESS -munaligned-access)
if(NOT COMPILER STREQUAL arcmwdt)
if(CONFIG_THREAD_LOCAL_STORAGE)
# Instruct compiler to use proper register as cached thread pointer for thread local storage.
# For ARCv2 the default register is usually not specified - so we need to specify it
# For ARCv3 the register is fixed to r30, so we don't need to specify it
zephyr_compile_options_ifdef(CONFIG_ISA_ARCV2 -mtp-regno=26)
else()
# If thread local storage isn't used - we can safely schedule thread pointer register
zephyr_compile_options_ifdef(CONFIG_ISA_ARCV2 -mtp-regno=none)
endif()
endif()
add_subdirectory(core)
if(COMPILER STREQUAL arcmwdt)
add_subdirectory(arcmwdt)
if(CONFIG_64BIT)
zephyr_compile_options(-Ml)
# Instruct MWDT assembler not to warn when we load only lower half (32bit) of symbol
# instead of full 64bit address in ASM code. It is valid as we don't support Zephyr
# linkage to high addresses for 64bit ARC platforms.
zephyr_compile_options(-Wa,-offwarn=168)
endif()
endif()
if(CONFIG_ISA_ARCV3 AND CONFIG_64BIT)
set_property(GLOBAL PROPERTY PROPERTY_OUTPUT_FORMAT elf64-littlearc64)
elseif(CONFIG_ISA_ARCV3 AND NOT CONFIG_64BIT)
set_property(GLOBAL PROPERTY PROPERTY_OUTPUT_FORMAT elf32-littlearc64)
else()
set_property(GLOBAL PROPERTY PROPERTY_OUTPUT_FORMAT elf32-littlearc)
endif()

6
arch/arc/Kbuild Normal file

@@ -0,0 +1,6 @@
subdir-ccflags-y +=-I$(srctree)/include/drivers
subdir-ccflags-y +=-I$(srctree)/drivers
subdir-asflags-y += $(subdir-ccflags-y)
obj-y += soc/$(SOC_PATH)/
obj-y += core/


@@ -1,7 +1,28 @@
# ARC options
# ARC EM4 options
#
# Copyright (c) 2014 Wind River Systems, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
choice
prompt "ARC SoC Selection"
depends on ARC
source "arch/arc/soc/*/Kconfig.soc"
endchoice
# Copyright (c) 2014, 2019 Wind River Systems, Inc.
# SPDX-License-Identifier: Apache-2.0
menu "ARC Options"
depends on ARC
@@ -9,136 +30,53 @@ menu "ARC Options"
config ARCH
default "arc"
config CPU_ARCEM
config ARCH_DEFCONFIG
string
default "arch/arc/defconfig"
menu "ARC EM4 processor options"
config CPU_ARCEM4
bool
default y
select CPU_ARCV2
select ATOMIC_OPERATIONS_C
help
This option signifies the use of an ARC EM CPU
This option signifies the use of an ARC EM4 CPU
config CPU_ARCHS
endmenu
menu "ARCv2 Family Options"
config CPU_ARCV2
bool
select ATOMIC_OPERATIONS_BUILTIN
select BARRIER_OPERATIONS_BUILTIN
default y
select NANOKERNEL_TICKLESS_IDLE_SUPPORTED
help
This option signifies the use of an ARC HS CPU
This option signifies the use of a CPU of the ARCv2 family.
choice
prompt "ARC Instruction Set"
default ISA_ARCV2
config ISA_ARCV2
bool "ARC ISA v2"
select ARCH_HAS_STACK_PROTECTION if ARC_HAS_STACK_CHECKING || (ARC_MPU && ARC_MPU_VER !=2)
select ARCH_HAS_USERSPACE if ARC_MPU
select ARCH_HAS_SINGLE_THREAD_SUPPORT if !SMP
select USE_SWITCH
select USE_SWITCH_SUPPORTED
config NSIM
prompt "Running on the MetaWare nSIM simulator"
bool
default n
help
v2 ISA for the ARC-HS & ARC-EM cores
For running on nSIM simulator.
config ISA_ARCV3
bool "ARC ISA v3"
select ARCH_HAS_SINGLE_THREAD_SUPPORT if !SMP
select USE_SWITCH
select USE_SWITCH_SUPPORTED
a) Uses non-XIP to run in RAM.
b) Linked at address 0x4000 with 0x4000 of RAM so that it works with
a pc_size of 16 (default).
endchoice
if ISA_ARCV2
config CPU_EM4
config DATA_ENDIANNESS_LITTLE
bool
select CPU_ARCEM
help
If y, the SoC uses an ARC EM4 CPU
config CPU_EM4_DMIPS
bool
select CPU_ARCEM
help
If y, the SoC uses an ARC EM4 DMIPS CPU
config CPU_EM4_FPUS
bool
select CPU_ARCEM
help
If y, the SoC uses an ARC EM4 DMIPS CPU with the single-precision
floating-point extension
config CPU_EM4_FPUDA
bool
select CPU_ARCEM
help
If y, the SoC uses an ARC EM4 DMIPS CPU with single-precision
floating-point and double assist instructions
config CPU_EM6
bool
select CPU_ARCEM
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an ARC EM6 CPU
config CPU_HS3X
bool
select CPU_ARCHS
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an ARC HS3x CPU
config CPU_HS4X
bool
select CPU_ARCHS
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an HS4X CPU
endif #ISA_ARCV2
if ISA_ARCV3
config CPU_HS5X
bool
select CPU_ARCHS
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an ARC HS5x CPU
config CPU_HS6X
bool
select CPU_ARCHS
select 64BIT
select CPU_HAS_DCACHE
select CPU_HAS_ICACHE
help
If y, the SoC uses an ARC HS6x CPU
endif #ISA_ARCV3
config FP_FPU_DA
bool
menu "ARC CPU Options"
config ARC_HAS_ZOL
bool
depends on ISA_ARCV2
default y
help
ARCv2 CPUs have the ZOL hardware loop mechanism, which the ARCv3 ISA drops.
Architecturally ZOL provides
- LPcc instruction
- LP_COUNT core reg
- LP_START, LP_END aux regs
Disabling this option removes usage of ZOL regs from code
This is driven by the processor implementation, since it is fixed in
hardware. The BSP should set this value to 'n' if the data is
implemented as big endian.
config NUM_IRQ_PRIO_LEVELS
int "Number of supported interrupt priority levels"
config NUM_IRQ_PRIO_LEVELS
int
prompt "Number of supported interrupt priority levels"
range 1 16
help
Interrupt priorities available will be 0 to NUM_IRQ_PRIO_LEVELS-1.
@@ -146,8 +84,9 @@ config NUM_IRQ_PRIO_LEVELS
The BSP must provide a valid default for proper operation.
config NUM_IRQS
int "Upper limit of interrupt numbers/IDs used"
config NUM_IRQS
int
prompt "Upper limit of interrupt numbers/IDs used"
range 17 256
help
Interrupts available will be 0 to NUM_IRQS-1.
@@ -157,290 +96,216 @@ config NUM_IRQS
The BSP must provide a valid default. This drives the size of the
vector table.
config RGF_NUM_BANKS
int "Number of General Purpose Register Banks"
depends on ARC_FIRQ
depends on NUM_IRQ_PRIO_LEVELS > 1
config RGF_NUM_BANKS
int
prompt "Number of General Purpose Register Banks"
depends on CPU_ARCV2
range 1 2
default 2
help
The ARC CPU can be configured to have more than one register
bank. If fast interrupts are supported (FIRQ), the 2nd
register bank, in the set, will be used by FIRQ interrupts.
If fast interrupts are supported but there is only 1
register bank, the fast interrupt handler must save
and restore general purpose registers.
NOTE: it's required to have more than one interrupt priority level
to use second register bank - otherwise all interrupts will use
same register bank. Such configuration isn't supported in software
and it is not beneficial from the performance point of view.
The ARC CPU can be configured to have more than one register
bank. If fast interrupts are supported (FIRQ), the 2nd
register bank, in the set, will be used by FIRQ interrupts.
If fast interrupts are supported but there is only 1
register bank, the fast interrupt handler must save
and restore general purpose registers.
config ARC_FIRQ
bool "FIRQ enable"
depends on ISA_ARCV2
depends on NUM_IRQ_PRIO_LEVELS > 1
depends on !ARC_HAS_SECURE
default y
help
Fast interrupts are supported (FIRQ). If FIRQ is enabled, for interrupts
with the highest priority, status32 and pc are saved in aux regs, and the
other regs are saved according to the number of register banks;
if FIRQ is disabled, interrupts with the highest priority are handled
the same way as other interrupts.
NOTE: we don't allow the configuration with FIRQ enabled and only one
interrupt priority level (so all interrupts are FIRQ). Such
configuration isn't supported in software and it is not beneficial
from the performance point of view.
config ARC_FIRQ_STACK
bool "Separate firq stack"
depends on ARC_FIRQ && RGF_NUM_BANKS > 1
help
Use a separate stack for FIRQ handling. When the fast irq is also a direct
irq, this gives the minimal interrupt latency.
config ARC_FIRQ_STACK_SIZE
int "FIRQ stack size"
depends on ARC_FIRQ_STACK
config FIRQ_STACK_SIZE
int
prompt "Size of stack for FIRQs (in bytes)"
depends on CPU_ARCV2
default 1024
help
The size of firq stack.
FIRQs and regular IRQs have different stacks so that a FIRQ can start
running without doing stack switching in software.
config ARC_HAS_STACK_CHECKING
bool "ARC has STACK_CHECKING"
depends on ISA_ARCV2
config ARC_STACK_CHECKING
bool "Enable Stack Checking"
depends on CPU_ARCV2
default n
help
ARCv2 has a special feature that allows checking for stack overflows. This
enables code that uses this debug feature.
config FAULT_DUMP
int
prompt "Fault dump level"
default 2
range 0 2
help
Different levels of information to display when a fault occurs.
2: The default. Display specific and verbose information. Consumes
the most memory (long strings).
1: Display general and short information. Consumes less memory
(short strings).
0: Off.
config IRQ_OFFLOAD
bool "Enable IRQ offload"
default n
help
Enable irq_offload() API which allows functions to be synchronously
run in interrupt context. Uses one entry in the IDT. Mainly useful
for test cases.
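A minimal usage sketch of the irq_offload() pattern this option enables; the headers and the const-ness of the argument are assumptions based on recent Zephyr trees and may differ in older ones:
#include <zephyr/kernel.h>
#include <zephyr/irq_offload.h>
#include <zephyr/sys/printk.h>
static void offloaded_body(const void *arg)
{
    /* Runs in interrupt context, synchronously with the caller. */
    printk("in ISR context, arg=%p\n", arg);
}
void offload_demo(void)
{
    irq_offload(offloaded_body, NULL);  /* returns after offloaded_body() completes */
}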
config XIP
default n if NSIM
default y
help
ARC is configured with STACK_CHECKING which is a mechanism for
checking stack accesses and raising an exception when a stack
overflow or underflow is detected.
config ARC_CONNECT
bool "ARC has ARC connect"
select SCHED_IPI_SUPPORTED
help
ARC is configured with ARC Connect, which is hardware for connecting
multiple cores.
config ARC_STACK_CHECKING
bool
select NO_UNUSED_STACK_INSPECTION
help
Use ARC STACK_CHECKING to do stack protection
config ARC_STACK_PROTECTION
bool
default y if HW_STACK_PROTECTION
select ARC_STACK_CHECKING if ARC_HAS_STACK_CHECKING
select MPU_STACK_GUARD if (!ARC_STACK_CHECKING && ARC_MPU && ARC_MPU_VER !=2)
select THREAD_STACK_INFO
help
This option enables either:
- The ARC stack checking, or
- the MPU-based stack guard
to cause a system fatal error
if the bounds of the current process stack are overflowed.
The two stack guard options are mutually exclusive. The
selection of the ARC stack checking is
prioritized over the MPU-based stack guard.
config ARC_USE_UNALIGNED_MEM_ACCESS
bool "Unaligned access in HW"
default y if CPU_ARCHS
depends on (CPU_ARCEM && !ARC_HAS_SECURE) || CPU_ARCHS
help
ARC EM cores without secure shield 2+2 mode support might be configured
to support unaligned memory access, which is then disabled by default.
Enable unaligned access in hardware and make the software use it.
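A minimal sketch of the kind of code this helps; the struct layout is illustrative. With the option (and the -munaligned-access compiler flag it selects) the compiler may emit a single load for the misaligned field instead of byte-wise accesses:
#include <stdint.h>
struct __attribute__((packed)) frame_hdr {
    uint8_t  type;
    uint32_t length;     /* starts at offset 1, i.e. misaligned */
};
uint32_t read_length(const struct frame_hdr *hdr)
{
    return hdr->length;  /* safe when unaligned access is enabled in HW */
}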
config ARC_CURRENT_THREAD_USE_NO_TLS
bool
select CURRENT_THREAD_USE_NO_TLS
default y if (RGF_NUM_BANKS > 1) || ("$(ZEPHYR_TOOLCHAIN_VARIANT)" = "arcmwdt")
help
Disable current Thread Local Storage for ARC. For cores with more than one
register bank (RGF_NUM_BANKS > 1), the parameter is disabled by default because bank
synchronization requires significant time and slows down performance.
ARCMWDT handles the TLS pointer in a different way than GCC. Optimized access to
the TLS pointer via the _current symbol does not provide significant advantages
in the case of MetaWare.
config GEN_ISR_TABLES
default y
config GEN_IRQ_START_VECTOR
default 16
config HARVARD
bool "Harvard Architecture"
prompt "Harvard Architecture"
bool
default n
help
The ARC CPU can be configured to have two buses;
The ARC CPU can be configured to have two busses;
one for instruction fetching and another that serves as a data bus.
config CODE_DENSITY
bool "Code Density Option"
config ICCM_SIZE
int "ICCM Size in kB"
help
Enable code density option to get better code density
This option specifies the size of the ICCM in kB. It is normally set by
the board's defconfig file and the user should generally avoid modifying
it via the menu configuration.
config ARC_HAS_ACCL_REGS
bool "Reg Pair ACCL:ACCH (FPU and/or MPY > 6 and/or DSP)"
default y if CPU_HS3X || CPU_HS4X || CPU_HS5X || CPU_HS6X
config ICCM_BASE_ADDRESS
hex "ICCM Base Address"
help
Depending on the configuration, the CPU can contain an accumulator reg-pair
(also referred to as r58:r59). These can also be used by gcc as GPRs, so the
kernel needs to save/restore them per process.
This option specifies the base address of the ICCM on the board. It is
normally set by the board's defconfig file and the user should generally
avoid modifying it via the menu configuration.
config ARC_HAS_SECURE
bool "ARC has SecureShield"
depends on ISA_ARCV2
select CPU_HAS_TEE
select ARCH_HAS_TRUSTED_EXECUTION
config DCCM_SIZE
int "DCCM Size in kB"
help
This option is enabled when ARC core supports secure mode
This option specifies the size of the DCCM in kB. It is normally set by
the board's defconfig file and the user should generally avoid modifying
it via the menu configuration.
config SJLI_TABLE_SIZE
int "SJLI table size"
depends on ARC_SECURE_FIRMWARE
default 8
config DCCM_BASE_ADDRESS
hex "DCCM Base Address"
help
The size of the sjli (Secure Jump and Link Indexed) table. The
code in normal mode calls secure services in secure mode through the
sjli instruction.
This option specifies the base address of the DCCM on the board. It is
normally set by the board's defconfig file and the user should generally
avoid modifying it via the menu configuration.
config ARC_SECURE_FIRMWARE
bool "Generate Secure Firmware"
depends on ARC_HAS_SECURE
default y if TRUSTED_EXECUTION_SECURE
config SRAM_SIZE
int "SRAM Size in kB"
help
This option indicates that we are building a Zephyr image that
is intended to execute in secure mode. The option is only
applicable to ARC processors that implement the SecureShield.
This option specifies the size of the SRAM in kB. It is normally set by
the board's defconfig file and the user should generally avoid modifying
it via the menu configuration.
This option enables Zephyr to include code that executes in
secure mode, as well as to exclude code that is designed to
execute only in normal mode.
Code executing in secure mode has access to both the secure
and normal resources of the ARC processors.
config ARC_NORMAL_FIRMWARE
bool "Generate Normal Firmware"
depends on !ARC_SECURE_FIRMWARE
depends on ARC_HAS_SECURE
default y if TRUSTED_EXECUTION_NONSECURE
config SRAM_BASE_ADDRESS
hex "SRAM Base Address"
help
This option indicates that we are building a Zephyr image that
is intended to execute in normal mode. Execution of this
image is triggered by secure firmware that executes in secure
mode. The option is only applicable to ARC processors that
implement the SecureShield.
This option specifies the base address of the SRAM on the board. It is
normally set by the board's defconfig file and the user should generally
avoid modifying it via the menu configuration.
This option enables Zephyr to include code that executes in
normal mode only, as well as to exclude code that is
designed to execute only in secure mode.
Code executing in normal mode has no access to secure
resources of the ARC processors, and, therefore, it shall avoid
accessing them.
config ARC_VPX_COOPERATIVE_SHARING
bool "Cooperative sharing of ARC VPX vector registers"
select SCHED_CPU_MASK if MP_MAX_NUM_CPUS > 1
config FLASH_SIZE
int "Flash Size in kB"
help
This option enables the cooperative sharing of the ARC VPX vector
registers. Threads that want to use those registers must successfully
call arc_vpx_lock() before using them, and call arc_vpx_unlock()
when done using them.
This option specifies the size of the flash in kB. It is normally set by
the board's defconfig file and the user should generally avoid modifying
it via the menu configuration.
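A minimal usage sketch for the cooperative-sharing API named above; only arc_vpx_lock()/arc_vpx_unlock() come from the help text, while the k_timeout_t parameter, the 0-on-success return convention, and the header path are assumptions based on current Zephyr:
#include <zephyr/kernel.h>
#include <zephyr/arch/arc/v2/vpx.h>      /* assumed header location */
void vpx_user(void)
{
    if (arc_vpx_lock(K_FOREVER) == 0) {  /* assumed: int arc_vpx_lock(k_timeout_t) */
        /* ... VPX vector computation ... */
        arc_vpx_unlock();                /* release the registers for other threads */
    }
}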
source "arch/arc/core/dsp/Kconfig"
menu "ARC MPU Options"
depends on CPU_HAS_MPU
config ARC_MPU_ENABLE
bool "Memory Protection Unit (MPU)"
select ARC_MPU
config FLASH_BASE_ADDRESS
hex "Flash Base Address"
help
Enable MPU
This option specifies the base address of the flash on the board. It is
normally set by the board's defconfig file and the user should generally
avoid modifying it via the menu configuration.
source "arch/arc/core/mpu/Kconfig"
config SW_ISR_TABLE
bool
prompt "Enable software interrupt handler table"
default y
help
Enable an interrupt handler table implemented in software. This
table, unlike ISRs connected directly in the vector table, allows
a parameter to be passed to the interrupt handlers. Also, invoking
the exception/interrupt exit stub is done automatically.
endmenu
config IRQ_VECTOR_TABLE_CUSTOM
bool
prompt "Projects provide a custom static IRQ part of vector table"
depends on !SW_ISR_TABLE
default n
help
Projects, not the BSP, provide the IRQ part of the vector table.
config DCACHE_LINE_SIZE
This is the table of interrupt handlers with the best potential
performance, but it is the least flexible.
The ISRs are installed directly in the vector table, thus are
directly called by the CPU when an interrupt is taken. This adds
the least overhead when handling an interrupt.
Downsides:
- ISRs cannot have a parameter
- ISRs cannot be connected at runtime
- ISRs must notify the kernel manually by invoking _ExcExit() when
they are about to return.
config IRQ_VECTOR_TABLE_BSP
bool
# omit prompt to signify a "hidden" option
depends on SW_ISR_TABLE || !IRQ_VECTOR_TABLE_CUSTOM
default y
help
Not user-selectable, helps build system logic.
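A minimal sketch of the difference in practice: with CONFIG_SW_ISR_TABLE=y an ISR registered through IRQ_CONNECT() receives a parameter and the kernel exit stub runs automatically; the IRQ number, priority, and handler below are placeholders:
#include <zephyr/kernel.h>
#include <zephyr/irq.h>
#define DEMO_IRQ       20   /* placeholder IRQ line */
#define DEMO_IRQ_PRIO  1
static void demo_isr(const void *arg)
{
    ARG_UNUSED(arg);        /* parameter passing relies on the SW ISR table */
}
void demo_irq_setup(void)
{
    IRQ_CONNECT(DEMO_IRQ, DEMO_IRQ_PRIO, demo_isr, NULL, 0);
    irq_enable(DEMO_IRQ);
}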
config ARCH_HAS_TASK_ABORT
bool
# omit prompt to signify a "hidden" option
default n
config ARCH_HAS_NANO_FIBER_ABORT
bool
# omit prompt to signify a "hidden" option
default n
config CACHE_LINE_SIZE_DETECT
bool
prompt "Detect d-cache line size at runtime"
default n
help
This option enables querying the d-cache build register to find
the d-cache line size, at the expense of more memory and code
and a slightly longer boot time.
If the CPU's d-cache line size is known in advance, disable this
option and manually enter the value for CACHE_LINE_SIZE.
config CACHE_LINE_SIZE
int
prompt "Cache line size" if !CACHE_LINE_SIZE_DETECT
default 32
help
Size in bytes of a CPU d-cache line.
config ARC_DCACHE_REGION_OPERATIONS
bool "DCACHE region operations"
depends on CACHE_MANAGEMENT && DCACHE
Detect automatically at runtime by selecting CACHE_LINE_SIZE_DETECT.
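Where the line size is known, buffers handled with cache operations are typically sized and aligned to it; a minimal sketch (the buffer name is illustrative; the symbol is DCACHE_LINE_SIZE in newer trees and CACHE_LINE_SIZE in older ones):
#include <stdint.h>
#include <zephyr/kernel.h>
/* Pad and align to whole d-cache lines so flush/invalidate operations on
 * this buffer cannot touch neighbouring data. */
static uint8_t rx_buf[256] __aligned(CONFIG_DCACHE_LINE_SIZE);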
config ARCH_CACHE_FLUSH_DETECT
bool
default n
help
Perform L1 data cache management operations by regions rather than line by line in a loop,
improves performance of cache management operations.
config ARC_SLC
bool "System level cache"
depends on CACHE_MANAGEMENT && DCACHE && (CPU_HS4X || CPU_HS3X)
config CACHE_FLUSHING
bool
default n
prompt "Enable d-cache flushing mechanism"
help
This option enables System Level Cache, and adds SLC support to the data cache management operations.
config ARC_SLC_LINE_SIZE
int "SLC line size"
depends on ARC_SLC
default 128
help
Size in bytes of a CPU system level cache line.
config ARC_EXCEPTION_STACK_SIZE
int "ARC exception handling stack size"
default 768 if !64BIT
default 2048 if 64BIT
help
Size in bytes of the exception handling stack, which sits at the top of
the interrupt stack to keep the memory footprint small, since exceptions
are infrequent. To limit the impact on interrupt handling, especially
nested interrupts, it cannot be too large.
This links in the sys_cache_flush() function, which provides a
way to flush multiple lines of the d-cache.
If the d-cache is present, set this to y.
If the d-cache is NOT present, set this to n.
endmenu
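The help text above refers to the legacy sys_cache_flush() helper; a minimal sketch of the same idea using the range-flush API of current Zephyr trees (sys_cache_data_flush_range(), named here as an assumption for newer versions):
#include <stdint.h>
#include <zephyr/kernel.h>
#include <zephyr/cache.h>
static uint8_t tx_buf[128] __aligned(CONFIG_DCACHE_LINE_SIZE);
void publish_tx_buf(void)
{
    /* Write dirty d-cache lines covering tx_buf back to memory before
     * another bus master (e.g. DMA) reads it. */
    sys_cache_data_flush_range(tx_buf, sizeof(tx_buf));
}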
config ARC_EARLY_SOC_INIT
bool "Make early stage SoC-specific initialization"
help
Call SoC per-core setup code during early-stage initialization
(before C runtime initialization). The setup code is invoked in the form of
the soc_early_asm_init_percpu assembler macro.
# ARC vector table must be aligned to 1KiB boundary, and will be at the
# start of the ROM region.
config ROM_START_OFFSET
default 0x400 if BOOTLOADER_MCUBOOT
config MAIN_STACK_SIZE
default 4096 if 64BIT
config ISR_STACK_SIZE
default 4096 if 64BIT
config SYSTEM_WORKQUEUE_STACK_SIZE
default 4096 if 64BIT
config IDLE_STACK_SIZE
default 1024 if 64BIT
config IPM_CONSOLE_STACK_SIZE
default 2048 if 64BIT
config TEST_EXTRA_STACK_SIZE
default 2048 if 64BIT
config CMSIS_THREAD_MAX_STACK_SIZE
default 2048 if 64BIT
config CMSIS_V2_THREAD_MAX_STACK_SIZE
default 2048 if 64BIT
config CMSIS_V2_THREAD_DYNAMIC_STACK_SIZE
default 2048 if 64BIT
source "arch/arc/soc/*/Kconfig"
endmenu

arch/arc/Makefile Normal file

@@ -0,0 +1,15 @@
cflags-y += $(call cc-option,-ffunction-sections,) $(call cc-option,-fdata-sections,)
cflags-$(CONFIG_ARC_STACK_CHECKING) = $(call cc-option,-fomit-frame-pointer)
cflags-$(CONFIG_LTO) = $(call cc-option,-flto,)
include $(srctree)/arch/$(ARCH)/soc/$(SOC_PATH)/Makefile
KBUILD_CFLAGS += $(cflags-y)
KBUILD_CXXFLAGS += $(cflags-y)
soc-cxxflags ?= $(soc-cflags)
soc-aflags ?= $(soc-cflags)
KBUILD_CFLAGS += $(soc-cflags)
KBUILD_CXXFLAGS += $(soc-cxxflags)
KBUILD_AFLAGS += $(soc-aflags)


@@ -1,5 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
if(CONFIG_ARCMWDT_LIBC OR CONFIG_CPP)
zephyr_sources(arcmwdt-dtr-stubs.c)
endif()

Some files were not shown because too many files have changed in this diff.