Files
zephyr/scripts/logging/dictionary/parserlib.py
Omri Sarig 152e6252fe scripts/logging: Handle partly read packets
The current implementation of the serial log parser runs in a loop:
every 2 seconds it checks whether any data is ready to be read and, if
so, reads and parses it.
If the last packet of that data is not yet fully available, the script
reads only the first part of the packet and then fails while parsing,
ending the run with an error.

This should not be the case, as having only part of a packet in the
buffer is not an error.
This commit fixes the problem: instead of failing because it does not
have enough data, the parser now parses all the complete packets it
has and keeps the last, partial packet in the queue, to be parsed
together with the next chunk of data.

This is done by updating the log parsers to return the amount of data
they parsed, and updating the calling scripts to handle this new return
value correctly.
Additionally, the parser now fails quietly when it encounters a partial
message, and throws an exception for any other kind of parsing error
(instead of returning a boolean return code).
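
To make the mechanism concrete, here is a minimal sketch of the intended
call-site pattern. read_chunk() and the 2-second poll are stand-ins for
the real serial-reading code, log_parser is a parser obtained from
parserlib.get_log_parser(), and parse_log_data() is assumed to return
the number of bytes it consumed, as described above:

    import time

    buffered = b''
    while True:
        chunk = read_chunk()  # hypothetical: read whatever bytes are currently available
        if not chunk:
            time.sleep(2)
            continue
        buffered += chunk
        # The parser consumes only complete packets and reports how many
        # bytes it used; anything after that offset is a partial packet
        # that is kept for the next iteration.
        parsed_bytes = log_parser.parse_log_data(buffered)
        buffered = buffered[parsed_bytes:]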

In addition to the partial packet handling, this commit also makes the
following minor improvements (see the sketch after this list):
* parserlib now fails by throwing an exception instead of exiting, so
  the calling code can choose a different error handling if needed.
* The dictionary and parser are now created before the parse operation.
  This makes the UART parser more efficient, and also removes the
  previously duplicated messages about build id, target architecture and
  endianness (which were printed every time new data was received from
  the UART).
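
As a rough sketch of how a calling script can combine these changes
(the database file name and the logdata placeholder are illustrative
only), the parser is created once up front and failures now surface as
ValueError for the caller to handle:

    import logging

    import parserlib

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("parser")

    try:
        # Created once, before any parsing, so the build id, target and
        # endianness lines are only logged a single time.
        log_parser = parserlib.get_log_parser("database.json", logger)
    except ValueError as err:
        logger.error("%s", err)
        raise

    logdata = b''  # placeholder for bytes received from the UART
    # Raises ValueError on bad input instead of exiting the process.
    parserlib.parser(logdata, log_parser, logger)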

Signed-off-by: Omri Sarig <omsi@demant.com>
2025-08-19 11:39:49 +02:00

59 lines
1.8 KiB
Python
Executable File

#!/usr/bin/env python3
#
# Copyright (c) 2021 Intel Corporation
# Copyright (c) 2024 Nordic Semiconductor ASA
#
# SPDX-License-Identifier: Apache-2.0

"""
Parser library for Dictionary-based Logging

This library along with dictionary_parser converts the
input binary data to the log using log database.
"""

import logging

import dictionary_parser
from dictionary_parser.log_database import LogDatabase


def get_log_parser(dbfile, logger):
    """Get the log parser for the given database.

    In addition to creating the parser, the function logs general information about the parser.
    """
    database = LogDatabase.read_json_database(dbfile)
    if database is None:
        logger.error("ERROR: Cannot open database file: exiting...")
        raise ValueError(f"Cannot open database file: {dbfile}")

    log_parser = dictionary_parser.get_parser(database)
    if log_parser is not None:
        logger.debug("# Build ID: %s", database.get_build_id())
        logger.debug("# Target: %s, %d-bit", database.get_arch(), database.get_tgt_bits())
        if database.is_tgt_little_endian():
            logger.debug("# Endianness: Little")
        else:
            logger.debug("# Endianness: Big")
    else:
        logger.error("ERROR: Cannot find a suitable parser matching database version!")
        raise ValueError("Cannot create parser.")

    return log_parser


def parser(logdata, log_parser, logger):
    """Parse the given log data with the supplied log parser.

    Returns the amount of data that was parsed; a trailing partial
    packet is left for the caller to keep for the next call.
    """
    if not isinstance(logger, logging.Logger):
        raise ValueError("Invalid logger instance. Please configure the logger!")

    if logdata is None:
        logger.error("ERROR: cannot read log from file: exiting...")
        raise ValueError("Cannot read log data.")

    ret = log_parser.parse_log_data(logdata)

    return ret