
raesl.compile

ESL compiler.

Compiles ESL documents and workspaces, meaning:

  • Parsing lines
  • Typechecking
  • Building an AST
  • Instantiating components, variables, and requirements
  • Deriving dependencies from these to build an output graph (network)

EslCompilationError

Bases: Exception

Error during ESL compilation.

to_graph

to_graph(
    *paths: Union[str, Path],
    output: Optional[Union[str, Path]] = None,
    force: bool = False
) -> Graph

Convert ESL file(s) into a ragraph.graph.Graph.

Parameters:

Name Type Description Default
paths Union[str, Path]

Paths to resolve into ESL files. May be any number of files and directories to scan.

()
output Optional[Union[str, Path]]

Optional output file (JSON) to write the graph to.

None
force bool

Whether to overwrite the output file or raise an error if the file already exists.

False

Returns:

Type Description
Graph

Instantiated graph.

Source code in src/raesl/compile/__init__.py
def to_graph(
    *paths: Union[str, Path],
    output: Optional[Union[str, Path]] = None,
    force: bool = False,
) -> Graph:
    """Convert ESL file(s) into a :obj:`ragraph.graph.Graph`.

    Arguments:
        paths: Paths to resolve into ESL files. May be any number of files and
            directories to scan.
        output: Optional output file (JSON) to write the graph to.
        force: Whether to overwrite the output file or raise an error if the file
            already exists.

    Returns:
        Instantiated graph.
    """

    diag, spec, graph = raesl.compile.cli.run(*paths, output=output, force=force)

    if graph is None:
        errors = "\n".join(str(d) for d in diag.diagnostics)
        raise EslCompilationError(
            f"Could not compile the specification into a Graph object:\n{errors}"
        )

    return graph
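The failure path above can be exercised in isolation. The following is a minimal, self-contained sketch of the same pattern; the stub exception and helper mirror, but are not, the real raesl objects:

```python
class EslCompilationError(Exception):
    """Error during ESL compilation (stand-in for raesl's exception)."""


def graph_or_raise(graph, diagnostics):
    # Mirrors to_graph's failure handling: when no graph could be
    # built, join all diagnostic messages into a single error.
    if graph is None:
        errors = "\n".join(str(d) for d in diagnostics)
        raise EslCompilationError(
            f"Could not compile the specification into a Graph object:\n{errors}"
        )
    return graph
```

In practice you call to_graph directly and catch EslCompilationError to report the collected diagnostics to the user.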

ast

Abstract Syntax Tree elements for ESL.

comment_storage

Code for handling documentation comments in AST objects.

The code has two main concepts: DocStore, which expresses the capability to find elements that can store doc comments, and DocAddElement, which expresses the capability to actually store such comments.

Storing a doc comment thus involves two steps: first, the correct element is searched for based on a supplied (possibly dotted) name; then, if an element is found, the comment is stored in it.

As the above is pretty minimal, several additional classes exist.

  • The DocElement can both store comments and provide them again. (Compound elements can't store comments themselves, but forward them to their children, and thus have no way to retrieve the comments given to them.)

  • The DefaultDocElement implements DocElement.

  • The DefaultDocStore implements DocStore as well as DocElement for non-dotted names, providing a simple way to add doc comment storage to many of the language elements.

Internally, some more classes exist.

  • The ProxyDocStore acts as a stand-in for a DocStore elsewhere in the specification. Its primary use is in comment sections, pointing to the actual elements that receive the doc comments of the section.

  • The DummyElement acts as a DocStore for any doc comment, and reports a warning about such comments being ignored. These elements act as a barrier and guard against doc comments being attached to a previous element where that should not happen. For example, doc comments after a 'define type' line should not be added to the definitions above that line.

  • Finally, DocCommentDistributor implements the distribution of doc comments to all elements of the specification after type-checking has been performed.
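The two-step protocol (resolve, then store) can be sketched with simplified stand-ins. The real classes work with Tokens, offsets, and dotted-name resolution rather than a plain dict, so treat this purely as an illustration; the element name used is hypothetical:

```python
from typing import Dict, List, Optional


class DocAddElement:
    """Stand-in for an element that can actually store doc comments."""

    def __init__(self) -> None:
        self.comments: List[str] = []

    def add_comment(self, text: str) -> None:
        self.comments.append(text)


class DocStore:
    """Stand-in for the lookup capability: find where a comment goes."""

    def __init__(self, elements: Dict[str, DocAddElement]) -> None:
        self.elements = elements

    def resolve_element(self, name: str) -> Optional[DocAddElement]:
        # Step 1: find the element for a (possibly dotted) name.
        return self.elements.get(name)


store = DocStore({"pump.pressure": DocAddElement()})
element = store.resolve_element("pump.pressure")
if element is not None:
    # Step 2: actually store the comment on the element that was found.
    element.add_comment("Maximum pressure delivered by the pump.")
```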

DefaultDocElement

DefaultDocElement()

Bases: DocElement

Default implementation for storing and retrieving doc comments.

Attributes:

Name Type Description
comments List[str]

The comments themselves, non-empty text after dropping the leading '#<' and surrounding white-space.

Source code in src/raesl/compile/ast/comment_storage.py
def __init__(self) -> None:
    super(DefaultDocElement, self).__init__()
    self.comments: List[str] = []
add_comment
add_comment(comment_tok: Token) -> None

Add found documentation comment.

Parameters:

Name Type Description Default
comment_tok Token

The raw documentation token to add.

required
Source code in src/raesl/compile/ast/comment_storage.py
def add_comment(self, comment_tok: Token) -> None:
    """Add found documentation comment.

    Arguments:
        comment_tok: The raw documentation token to add.
    """
    self.comments.extend(decode_doc_comments(comment_tok))

DefaultDocStore

DefaultDocStore(doc_tok: Optional[Token])

Bases: DocStore, DefaultDocElement

Class that can store and retrieve doc-comments for non-dotted names.

Parameters:

Name Type Description Default
doc_tok Optional[Token]

Token defining the position of the element in the input for documenting. Documentation comments after this position and before any other existing DocStore.doc_tok will be attached to this element.

required
Source code in src/raesl/compile/ast/comment_storage.py
def __init__(self, doc_tok: Optional[Token]):
    # Can't use super() due to different argument lists.
    DocStore.__init__(self, doc_tok)
    DefaultDocElement.__init__(self)

DocAddElement

Interface class of an element that can store doc comments.

add_comment
add_comment(comment_tok: Token) -> None

Add found documentation comment.

Parameters:

Name Type Description Default
comment_tok Token

The raw documentation token to add.

required
Source code in src/raesl/compile/ast/comment_storage.py
def add_comment(self, comment_tok: Token) -> None:
    """Add found documentation comment.

    Arguments:
        comment_tok: The raw documentation token to add.
    """
    raise NotImplementedError("Implement me in {}".format(repr(self)))

DocCommentDistributor

DocCommentDistributor(diag_store: DiagnosticStore)

Class for assigning documentation comments to relevant elements in the specification.

Parameters:

Name Type Description Default
diag_store DiagnosticStore

Storage for reported diagnostics.

required

Attributes:

Name Type Description
elements List[DocStore]

Elements interested in receiving documentation comments.

dummy_elements List[DummyElement]

Elements that catch documentation comments without a proper home, for warning the user about such comments.

Source code in src/raesl/compile/ast/comment_storage.py
def __init__(self, diag_store: diagnostics.DiagnosticStore):
    self.diag_store = diag_store
    self.elements: List[DocStore] = []
    self.dummy_elements: List[DummyElement] = []

    # This 'tok' violates pretty much all assumptions of Token, do not try to use
    # it outside the distributor context.
    tok = Token(
        tok_type="DUMMY_TK",
        tok_text="",
        fname=None,
        offset=-1,
        line_offset=-1,
        line_num=-1,
    )
    self.add_dummy_element(tok)
add_dummy_element
add_dummy_element(
    doc_tok: Token, report_errors: bool = True
)

Insert a dummy element based on the provided token.

Parameters:

Name Type Description Default
doc_tok Token

Token that defines the position of the dummy element.

required
report_errors bool

Whether to report errors for comments that get attached to the dummy element.

True
Source code in src/raesl/compile/ast/comment_storage.py
def add_dummy_element(self, doc_tok: Token, report_errors: bool = True):
    """Insert a dummy element based on the provided token.

    Arguments:
        doc_tok: Token that defines the position of the dummy element.
        report_errors: Whether to report errors for comments that get attached to
            the dummy element.
    """
    dds = DummyElement(doc_tok, report_errors)
    self.dummy_elements.append(dds)
    self.add_element(dds)
add_element
add_element(element: DocStore)

Add the provided element to the elements interested in getting doc comments.

Parameters:

Name Type Description Default
element DocStore

Element to add.

required
Source code in src/raesl/compile/ast/comment_storage.py
def add_element(self, element: DocStore):
    """Add the provided element to the elements interested in getting doc comments.

    Arguments:
        element: Element to add.
    """
    self.elements.append(element)
resolve
resolve(doc_comments: List[Token])

Distribute the provided documentation comments to the interested elements, and report any documentation comments that end up in a DummyElement, as those are at the wrong spot in the specification.

Parameters:

Name Type Description Default
doc_comments List[Token]

Documentation comments found in the input specification.

required
Source code in src/raesl/compile/ast/comment_storage.py
def resolve(self, doc_comments: List[Token]):
    """Distribute the provided documentation comments to the interested elements,
    and report any documentation comments that get assigned in a DummyElement,
    as those are at the wrong spot in the specification.

    Arguments:
        doc_comments: Documentation comments found in the input specification.
    """
    if not doc_comments:  # No comments, nothing to do.
        return

    # At this point, there is
    # - the provided doc_comments, a mostly sorted list of 'DOC_COMMENT_TK' tokens
    #   containing the documentation comments.
    # - the self.elements list, the interested elements, including all dummy
    #   elements for reporting about doc comments at weird places. The tokens of the
    #   former must be assigned to the latter based on offset information and file.
    #   All doc comments must be assigned to that element with the largest offset
    #   less than the offset of the comment within the same file.

    # Sort the comments on file name and offset.
    dc_dict = defaultdict(list)
    for dc in doc_comments:
        dc_dict[dc.fname].append(dc)

    for val in dc_dict.values():
        val.sort(key=lambda tok: tok.offset)

    # The tricky part is that the element list may have several elements at the
    # same offset, due to blindly inserting dummy elements. This can be resolved by
    # either being more careful about offsets of dummy elements, or by filtering
    # such duplicates afterwards.
    #
    # The code here does the latter. Sort the interested elements, and filter the
    # dummy duplicate elements out of it. Also append a None element to notify
    # about the end of the list.

    # Sort the elements on file name and offset.
    elm_dict = defaultdict(list)
    for elm in self.elements:
        fname = getattr(elm.doc_tok, "fname", "unknown-file")
        elm_dict[fname].append(elm)

    for key, val in elm_dict.items():
        offsets = [v.doc_tok.offset for v in val]
        sequence = [idx for _, idx in sorted(zip(offsets, range(len(val))))]
        elm_dict[key] = [val[idx] for idx in sequence]

    nondup_elm_dict = {}
    for key, val in elm_dict.items():
        nondup_elm_dict[key] = list(_drop_dup_dummies(val))

    # Last element before the position at 'doc_index'.
    cur_elem: Optional[DocStore] = None

    # Next element after cur_elem, or None if not initialized or at the end.
    next_elem: Optional[DocStore] = None

    for cur_file in nondup_elm_dict:
        if cur_file not in dc_dict:
            continue

        dc_list = dc_dict[cur_file]
        doc_index = 0
        cur_elem = None
        next_elem = None

        for elem in itertools.chain(nondup_elm_dict[cur_file], _gen_none()):
            if cur_elem is None and next_elem is None:
                # At startup, setup next_elem for the next iteration
                next_elem = elem
                # Holds as the distributor inserts a dummy element at offset -1
                # thus it is always a non-empty list.
                assert next_elem is not None

                # Paranoia check, first comment should be at or after that element.
                comment_offset = dc_list[doc_index].offset
                assert next_elem.doc_tok
                if comment_offset < next_elem.doc_tok.offset:
                    loc = dc_list[doc_index].get_location(comment_offset)
                    self.diag_store.add(
                        diagnostics.W300(element=None, location=loc, comments=[loc])
                    )

                continue

            # Regular iteration
            cur_elem = next_elem
            next_elem = elem  # Might be None
            # Find the document element pointed at by 'cur_elem'.
            assert cur_elem is not None
            assert cur_elem.doc_tok is not None
            cur_doc_element = cur_elem.resolve_element(cur_elem.doc_tok.tok_text)
            if cur_doc_element is None:
                # Element name cannot be resolved, throw an error and ignore it
                # further.
                offset = cur_elem.get_error_position(cur_elem.doc_tok.tok_text)
                loc = cur_elem.doc_tok.get_location(offset)
                name = cur_elem.doc_tok.tok_text
                self.diag_store.add(diagnostics.W300(element=name, location=loc))

            # Process the doc comments belonging to the current element.
            while doc_index < len(dc_list):
                comment_offset = dc_list[doc_index].offset

                # If next comment is at or after the next element, cur_elem is done.
                if (
                    next_elem is not None
                    and next_elem.doc_tok is not None
                    and comment_offset >= next_elem.doc_tok.offset
                ):
                    break

                if cur_doc_element is not None:
                    # Reported a problem about failing to resolve already, thus
                    # silently skipping is fine.
                    cur_doc_element.add_comment(dc_list[doc_index])

                doc_index = doc_index + 1

            # If all comments have been processed, ignore the remaining elements.
            if doc_index >= len(dc_list):
                break

        # All doc comments distributed, check that nothing ended up at a weird spot.
        for elem in self.dummy_elements:
            elem.report(self.diag_store)
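The core assignment rule in resolve — within one file, each comment goes to the element with the largest offset at or before the comment's offset — can be modelled with plain integer offsets. This sketch omits file grouping and duplicate-dummy filtering; the sentinel at -1 plays the role of the distributor's initial dummy element:

```python
from bisect import bisect_right
from collections import defaultdict
from typing import Dict, List


def assign_comments(
    element_offsets: List[int], comment_offsets: List[int]
) -> Dict[int, List[int]]:
    """Map each element offset to the comment offsets assigned to it.

    element_offsets must be sorted; the first entry is typically a
    sentinel at -1, like the distributor's dummy element.
    """
    assigned: Dict[int, List[int]] = defaultdict(list)
    for offset in sorted(comment_offsets):
        # Last element starting at or before this comment receives it.
        index = bisect_right(element_offsets, offset) - 1
        assigned[element_offsets[index]].append(offset)
    return dict(assigned)


# Elements at offsets 10 and 50, plus the sentinel dummy at -1.
result = assign_comments([-1, 10, 50], [3, 12, 55])
```

A comment at offset 3 lands on the sentinel (and would be reported as misplaced), while comments at 12 and 55 attach to the elements at offsets 10 and 50, respectively.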

DocElement

Bases: DocAddElement

Interface class of an element that can store and retrieve doc comments.

get_comment
get_comment() -> List[str]

Retrieve the stored comments.

Source code in src/raesl/compile/ast/comment_storage.py
def get_comment(self) -> List[str]:
    """Retrieve the stored comments."""
    raise NotImplementedError("Implement me in {}".format(repr(self)))

DocStore

DocStore(doc_tok: Optional[Token])

Interface class that can find where to store doc comments for a given name. If doc_tok is None, the element does not get any documentation comments.

Parameters:

Name Type Description Default
doc_tok Optional[Token]

Token defining the position of the element in the input for documenting. Documentation comments after this position and before any other existing DocStore.doc_tok will be attached to this element.

required
Source code in src/raesl/compile/ast/comment_storage.py
def __init__(self, doc_tok: Optional[Token]):
    self.doc_tok = doc_tok
get_error_position
get_error_position(name: str) -> int

Return the index in the given string where an error occurs in resolving the name.

Parameters:

Name Type Description Default
name str

Name of the element to find.

required

Returns:

Type Description
int

Approximated index in the string where matching the element fails. Returned value has no meaning if resolving succeeds.

Source code in src/raesl/compile/ast/comment_storage.py
def get_error_position(self, name: str) -> int:
    """Return the index in the given string where an error occurs in resolving the
    name.

    Arguments:
        name: Name of the element to find.

    Returns:
        Approximated index in the string where matching the element fails.
            Returned value has no meaning if resolving succeeds.
    """
    return 0
resolve_element
resolve_element(name: str) -> Optional[DocAddElement]

Try to find the documentation element indicated by its (dotted) name.

Parameters:

Name Type Description Default
name str

Name of the element to find.

required

Returns:

Type Description
Optional[DocAddElement]

The documentation element associated with the provided name if it can be resolved.

Source code in src/raesl/compile/ast/comment_storage.py
def resolve_element(self, name: str) -> Optional[DocAddElement]:
    """Try to find the documentation element indicated by its (dotted) name.

    Arguments:
        name: Name of the element to find.

    Returns:
        The documentation element associated with the provided name if it can be
            resolved.
    """
    raise NotImplementedError("Implement me in {}".format(repr(self)))

DummyElement

DummyElement(doc_tok: Token, report_errors: bool = True)

Bases: DocStore, DocElement

Class for catching documentation elements that have no proper owner. Used for reporting warnings that such documentation is ignored.

Parameters:

Name Type Description Default
doc_tok Token

Token defining the position of the element in the input for documenting. Documentation comments after this position and before any other existing DocStore.doc_tok will be attached to this element.

required
report_errors bool

Whether to add errors to the problem storage.

True
Source code in src/raesl/compile/ast/comment_storage.py
def __init__(self, doc_tok: Token, report_errors: bool = True):
    super(DummyElement, self).__init__(doc_tok)
    self.raw_comments: List[Token] = []
    self.report_errors = report_errors
report
report(diag_store: DiagnosticStore)

Report a warning about all received documentation comments.

Parameters:

Name Type Description Default
diag_store DiagnosticStore

Storage for reported diagnostics.

required
Source code in src/raesl/compile/ast/comment_storage.py
def report(self, diag_store: diagnostics.DiagnosticStore):
    """Report a warning about all received documentation comments.

    Arguments:
        diag_store: Storage for reported diagnostics.
    """
    if not self.raw_comments or not self.report_errors:
        return

    locs = [tok.get_location() for tok in self.raw_comments]
    diag_store.add(diagnostics.W300(location=locs[0], comments=locs))

ProxyDocStore

ProxyDocStore(doc_tok: Token, real_element: DocStore)

Bases: DocStore

Proxy element that represents a real DocStore element except at a different position in the specification. This is useful in 'comment' sections where names of elements are provided that exist elsewhere in the component.

Parameters:

Name Type Description Default
doc_tok Token

Token defining the position of the element in the input for documenting. Documentation comments after this position and before any other existing DocStore.doc_tok will be attached to this element.

required
real_element DocStore

Real element which is symbolically at the 'doc_tok' position, too.

required
Source code in src/raesl/compile/ast/comment_storage.py
def __init__(self, doc_tok: Token, real_element: DocStore):
    super(ProxyDocStore, self).__init__(doc_tok)
    self.real_element = real_element

decode_doc_comments

decode_doc_comments(comment_tok: Token) -> List[str]

Convert a doc comment token to the containing description text.

Parameters:

Name Type Description Default
comment_tok Token

Token with the documentation comment.

required

Returns:

Type Description
List[str]

The text, insofar as it exists.

Source code in src/raesl/compile/ast/comment_storage.py
def decode_doc_comments(comment_tok: Token) -> List[str]:
    """Convert a doc comment token to the containing description text.

    Arguments:
        comment_tok: Token with the documentation comment.

    Returns:
        The text (for as far as it exists).
    """
    assert comment_tok.tok_text.startswith("#<")
    text = comment_tok.tok_text[2:].strip()
    if text:
        return [text]
    return []
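The decoding rule is small enough to restate standalone: everything after the leading '#<' prefix is stripped, and comments that are empty after stripping produce no text. This mirrors decode_doc_comments but works on a plain string instead of a Token:

```python
from typing import List


def decode(tok_text: str) -> List[str]:
    """Decode one doc comment string, mirroring decode_doc_comments."""
    assert tok_text.startswith("#<")
    text = tok_text[2:].strip()
    # Empty comments contribute no documentation text at all.
    return [text] if text else []
```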

components

Component definitions with their contents.

BehaviorCase

BehaviorCase(
    name_tok: Token,
    conditions: List[BehaviorCondition],
    results: List[BehaviorResult],
)

A set of desired behavioral results given a set of conditions.

Parameters:

Name Type Description Default
name_tok Token

Name of the behavior case.

required
conditions List[BehaviorCondition]

Conditions that should hold for the case to apply.

required
results List[BehaviorResult]

Results that should hold when the case applies.

required
Source code in src/raesl/compile/ast/components.py
def __init__(
    self,
    name_tok: "Token",
    conditions: List[BehaviorCondition],
    results: List[BehaviorResult],
):
    self.name_tok = name_tok
    self.conditions = conditions
    self.results = results

BehaviorCondition

BehaviorCondition(
    name_tok: Token,
    comparison: Union[Disjunction, RelationComparison],
)

A condition of a case.

Source code in src/raesl/compile/ast/components.py
def __init__(self, name_tok: "Token", comparison: Union["Disjunction", "RelationComparison"]):
    self.name_tok = name_tok
    self.comparison = comparison

BehaviorFunction

BehaviorFunction(behavior_kind: str, name_tok: Token)

Bases: DefaultDocStore

One function specifying some behavior.

Parameters:

Name Type Description Default
behavior_kind str

Kind of behavior. Either 'requirement' or 'constraint'.

required
name_tok Token

Name of the behavior.

required

Attributes:

Name Type Description
cases List[BehaviorCase]

Behavior cases.

default_results Optional[List[BehaviorResult]]

Results that hold when none of the cases applies. None means there is no default result.

Source code in src/raesl/compile/ast/components.py
def __init__(self, behavior_kind: str, name_tok: "Token"):
    super(BehaviorFunction, self).__init__(name_tok)
    self.behavior_kind = behavior_kind
    self.name_tok = name_tok
    self.cases: List[BehaviorCase] = []
    self.default_results: Optional[List[BehaviorResult]] = None

BehaviorResult

BehaviorResult(name_tok: Token, result: Comparison)

A result of a case.

Source code in src/raesl/compile/ast/components.py
def __init__(self, name_tok: "Token", result: "Comparison"):
    self.name_tok = name_tok
    self.result = result

ComponentDefinition

ComponentDefinition(
    pos_tok: Token, name_tok: Optional[Token]
)

Bases: DefaultDocStore

ESL component definition.

Parameters:

Name Type Description Default
pos_tok Token

Position of the definition. Either the name token or the 'world' token.

required
name_tok Optional[Token]

Token with the name of the component definition, None means 'world'.

required

Attributes:

Name Type Description
variables List[VarParam]

Variables of the component definition.

parameters List[VarParam]

Parameters of the component definition.

var_groups List[VariableGroup]

Groups of variables with a name.

component_instances List[ComponentInstance]

Component instances of the component definition.

needs List[Need]

Needs of the component definition.

goals List[Goal]

Goals of the component definition.

transforms List[Transformation]

Transformations of the component definition.

designs List[Design]

Designs of the component definition.

relations List[RelationInstance]

Relation instances of the component definition.

behaviors List[BehaviorFunction]

Behavior functions of the component definition.

Source code in src/raesl/compile/ast/components.py
def __init__(self, pos_tok: "Token", name_tok: Optional["Token"]):
    super(ComponentDefinition, self).__init__(name_tok)
    self.pos_tok = pos_tok
    self.name_tok = name_tok

    self.variables: List[VarParam] = []
    self.parameters: List[VarParam] = []
    self.var_groups: List[VariableGroup] = []
    self.component_instances: List[ComponentInstance] = []
    self.needs: List[Need] = []
    self.goals: List[Goal] = []
    self.transforms: List[Transformation] = []
    self.designs: List[Design] = []
    self.relations: List[RelationInstance] = []
    self.behaviors: List["BehaviorFunction"] = []

ComponentInstance

ComponentInstance(
    inst_name_tok: Token, def_name_tok: Token
)

Bases: DefaultDocStore

ESL component instance in a component definition.

Parameters:

Name Type Description Default
inst_name_tok Token

Token with the name of the component instance.

required
def_name_tok Token

Token with the name of the component definition to apply.

required

Attributes:

Name Type Description
arguments List[InstanceArgument]

Arguments of the instance.

compdef Optional[ComponentDefinition]

Component definition matching the name in 'def_name_tok', if it exists. Set during type checking.

Source code in src/raesl/compile/ast/components.py
def __init__(self, inst_name_tok: "Token", def_name_tok: "Token"):
    super(ComponentInstance, self).__init__(inst_name_tok)
    self.inst_name_tok = inst_name_tok
    self.def_name_tok = def_name_tok
    self.arguments: List[InstanceArgument] = []

    self.compdef: Optional[ComponentDefinition] = None

Design

Design(label_tok: Token, expr: Expression)

Bases: DefaultDocStore

Design rule in a component.

Parameters:

Name Type Description Default
label_tok Token

Name of the design rule.

required
expr Expression

Condition expressed in the design.

required

Attributes:

Name Type Description
design_kind Optional[str]

Kind of the design, filled in after construction. Contains either 'requirement' or 'constraint'.

sub_clauses List[SubClause]

Sub-clauses of the design.

Source code in src/raesl/compile/ast/components.py
def __init__(self, label_tok: "Token", expr: "Expression"):
    super(Design, self).__init__(label_tok)
    self.design_kind: Optional[str] = None
    self.label_tok = label_tok
    self.expr = expr
    self.sub_clauses: List[SubClause] = []

Flow

Flow(name_tok: Token)

Flow in a goal or Transformation.

Parameters:

Name Type Description Default
name_tok Token

Dotted name of the flow.

required

Attributes:

Name Type Description
flow_node Optional[VarNode]

If not None, node represented by the flow.

Source code in src/raesl/compile/ast/components.py
def __init__(self, name_tok: "Token") -> None:
    self.name_tok = name_tok
    self.flow_node: Optional[VarNode] = None

Goal

Goal(
    label_tok: Token,
    active: Token,
    doesaux: Token,
    verb: Token,
    flows: List[Flow],
    prepos: Token,
    passive: Token,
)

Bases: DefaultDocStore

Goal in an ESL component definition.

Parameters:

Name Type Description Default
label_tok Token

Label name of the goal.

required
active Token

Token with the name of the active component.

required
doesaux Token

'does' or auxiliary word token.

required
verb Token

Verb of the goal.

required
flows List[Flow]

Flows of the goal.

required
prepos Token

Token with the preposition word.

required
passive Token

Token with the name of the passive component.

required

Attributes:

Name Type Description
goal_kind Optional[str]

Kind of goal, filled in after construction. Either the string 'requirement' or 'constraint'.

sub_clauses List[SubClause]

Sub-clauses of the goal.

active_comp Optional[ComponentInstance]

If not None, resolved active component instance of the goal.

passive_comp Optional[ComponentInstance]

If not None, resolved passive component instance of the goal.

Source code in src/raesl/compile/ast/components.py
def __init__(
    self,
    label_tok: "Token",
    active: "Token",
    doesaux: "Token",
    verb: "Token",
    flows: List[Flow],
    prepos: "Token",
    passive: "Token",
):
    super(Goal, self).__init__(label_tok)
    self.goal_kind: Optional[str] = None
    self.label_tok = label_tok
    self.active = active
    self.doesaux = doesaux
    self.verb = verb
    self.flows = flows
    self.prepos = prepos
    self.passive = passive
    self.sub_clauses: List[SubClause] = []

    self.active_comp: Optional[ComponentInstance] = None
    self.passive_comp: Optional[ComponentInstance] = None

InstanceArgument

InstanceArgument(
    name_tok: Token, argnode: Optional[Node] = None
)

Actual argument of a component or relation.

Parameters:

Name Type Description Default
name_tok Token

Name of the actual argument.

required
argnode Optional[Node]

Node of the argument, filled during type checking.

None
Source code in src/raesl/compile/ast/components.py
def __init__(self, name_tok: "Token", argnode: Optional["Node"] = None):
    self.name_tok = name_tok
    self.argnode = argnode

Need

Need(
    label_tok: Token, subject_tok: Token, description: str
)

Bases: DefaultDocStore

Informal need in ESL.

Parameters:

Name Type Description Default
label_tok Token

Token with the name of the label.

required
subject_tok Token

Token with the name of the subject of the need.

required
description str

Description of the need.

required

Attributes:

Name Type Description
subject Optional[NeedSubjectTypes]

If not None, subject of the need.

Source code in src/raesl/compile/ast/components.py
def __init__(self, label_tok: "Token", subject_tok: "Token", description: str):
    super(Need, self).__init__(label_tok)
    self.label_tok = label_tok
    self.subject_tok = subject_tok
    self.description = description

    self.subject: Optional[NeedSubjectTypes] = None

RelationInstance

RelationInstance(
    inst_name_tok: Token,
    def_name_tok: Token,
    arguments: List[List[InstanceArgument]],
    reldef: Optional[RelationDefinition],
)

Bases: DefaultDocStore

ESL relation instance in a component definition.

Parameters:

Name Type Description Default
inst_name_tok Token

Token with the name of the relation instance.

required
def_name_tok Token

Token with the name of the relation definition to apply.

required
arguments List[List[InstanceArgument]]

Arguments of the instance. One element for each parameter, where one element may have several arguments due to the 'one or more' feature.

required
reldef Optional[RelationDefinition]

Relation definition of this instance.

required
Source code in src/raesl/compile/ast/components.py
def __init__(
    self,
    inst_name_tok: "Token",
    def_name_tok: "Token",
    arguments: List[List[InstanceArgument]],
    reldef: Optional["RelationDefinition"],
):
    super(RelationInstance, self).__init__(inst_name_tok)
    self.inst_name_tok = inst_name_tok
    self.def_name_tok = def_name_tok
    self.arguments = arguments
    self.reldef = reldef

SubClause

SubClause(label_tok: Token, expr: Expression)

Subclause in a goal, transformation, or behavior.

Parameters:

Name Type Description Default
label_tok Token

Name of the subclause.

required
expr Expression

Expression describing the subclause.

required
Source code in src/raesl/compile/ast/components.py
def __init__(self, label_tok: "Token", expr: "Expression"):
    self.label_tok = label_tok
    self.expr = expr

Transformation

Transformation(
    label_tok: Token,
    doesaux_tok: Token,
    verb_tok: Token,
    in_flows: List[Flow],
    prepos_tok: Token,
    out_flows: List[Flow],
)

Bases: DefaultDocStore

Transformation in a component.

Parameters:

Name Type Description Default
label_tok Token

Label name of the transformation.

required
doesaux_tok Token

'does' or auxiliary word token.

required
verb_tok Token

Verb of the transformation.

required
in_flows List[Flow]

Inputs required for the transformation.

required
prepos_tok Token

Preposition of the transformation.

required
out_flows List[Flow]

Outputs resulting from the transformation.

required

Attributes:

Name Type Description
transform_kind Optional[str]

Kind of transformation, filled in after construction. Either the string 'requirement' or 'constraint'.

sub_clauses List[SubClause]

Sub-clauses of the transformation.

Source code in src/raesl/compile/ast/components.py
def __init__(
    self,
    label_tok: "Token",
    doesaux_tok: "Token",
    verb_tok: "Token",
    in_flows: List[Flow],
    prepos_tok: "Token",
    out_flows: List[Flow],
):
    super(Transformation, self).__init__(label_tok)
    self.transform_kind: Optional[str] = None
    self.label_tok = label_tok
    self.doesaux_tok = doesaux_tok
    self.verb_tok = verb_tok
    self.in_flows = in_flows
    self.prepos_tok = prepos_tok
    self.out_flows = out_flows
    self.sub_clauses: List[SubClause] = []

VarParam

VarParam(
    is_variable: bool,
    name_tok: Token,
    type_tok: Token,
    is_property: bool = False,
)

Bases: DocStore

ESL component definition variable or parameter.

Parameters:

Name Type Description Default
is_variable bool

Whether the object represents a variable.

required
name_tok Token

Token with the name of the variable being defined.

required
type_tok Token

Token with the name of the type of the variable being defined.

required
is_property bool

Whether the parameter is a property.

False

Attributes:

Name Type Description
type Optional[BaseType]

Type of the variable, if it exists. Set during type checking.

Source code in src/raesl/compile/ast/components.py
def __init__(
    self,
    is_variable: bool,
    name_tok: "Token",
    type_tok: "Token",
    is_property: bool = False,
):
    super(VarParam, self).__init__(name_tok)
    self.is_variable = is_variable
    self.name_tok = name_tok
    self.type_tok = type_tok
    self.is_property = is_property
    self.type: Optional["BaseType"] = None

    self.node: Optional["VarNode"] = None

    # Variables are never explicit property, since they naturally belong
    # to the component defining them.
    assert not self.is_property or not self.is_variable
get_error_position
get_error_position(name: str) -> int

Return the index in the given string where an error occurs in resolving the node.

Parameters:

Name Type Description Default
name str

Name of the element to find.

required

Returns:

Type Description
int

Approximated index in the string where matching the element fails. Returned value has no meaning if resolving a node succeeds.

Source code in src/raesl/compile/ast/components.py
def get_error_position(self, name: str) -> int:
    """Return the index in the given string where an error occurs in resolving the
    node.

    Arguments:
        name: Name of the element to find.

    Returns:
        Approximated index in the string where matching the element fails.
        Returned value has no meaning if resolving a node succeeds.
    """
    local_name, remaining_name, dot_length = split_first_dot(name)

    if self.name_tok.tok_text != local_name:
        return 0  # Local name == first name is wrong.
    else:
        # Ask child about the position of the error.
        offset = self.node.get_error_position(remaining_name)
        return offset + len(local_name) + dot_length
resolve_node
resolve_node(name: str) -> Optional[VarNode]

Find the varparam (sub) node that matches the dotted 'name'.

Parameters:

Name Type Description Default
name str

Possibly dotted name that should point at an existing sub-node. The empty string denotes 'self'.

required

Returns:

Type Description
Optional[VarNode]

The node that matches the name, or None if no such node exists. In the latter case, use 'self.get_error_position(name)' to get an indication where the match fails in the name.

Source code in src/raesl/compile/ast/components.py
def resolve_node(self, name: str) -> Optional["VarNode"]:
    """Find the varparam (sub) node that matches the dotted 'name'.

    Arguments:
        name: Possibly dotted name that should point at an existing sub-node.
            The empty string denotes 'self'.

    Returns:
        The node that matches the name, or None if no such node exists. In the
            latter case, use 'self.get_error_position(name)' to get
            an indication where the match fails in the name.
    """
    local_name, remaining_name, _dot_length = split_first_dot(name)

    if self.name_tok.tok_text != local_name:
        return None

    assert self.node is not None, "Trying to use non-existing node of '{}'".format(
        self.name_tok.tok_text
    )
    return self.node.resolve_node(remaining_name)
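The dotted-name resolution above leans on the internal helper `split_first_dot`. A minimal sketch of its assumed behavior (splitting off the first name part and reporting whether a dot was consumed — a hypothetical reconstruction, not the library's actual implementation) illustrates the recursion:

```python
def split_first_dot(name: str):
    """Hypothetical sketch: split a dotted name into
    (local_name, remaining_name, dot_length), where dot_length is 1
    when a dot was consumed and 0 otherwise."""
    head, sep, tail = name.partition(".")
    return head, tail, len(sep)

# Resolving "design.power.supply" first matches "design" locally,
# then recurses into the child node with "power.supply".
local, remaining, dots = split_first_dot("design.power.supply")
print(local, remaining, dots)  # design power.supply 1
```

The `dot_length` return value lets `get_error_position` convert a failure offset in a child back to an offset in the full dotted string.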

VariableGroup

VariableGroup(
    name_tok: Token, variablepart_names: List[Token]
)

One variable group in ESL (a named group of variables).

It has no documentation comment, as its only purpose is to enable interfacing to child components.

As a variable group need not contain uniquely named variables, their names cannot be used to build a Compound type. The group therefore stays a plain group, which is dealt with during component instantiation.

Parameters:

Name Type Description Default
name_tok Token

Token with the name of the group being defined.

required
variablepart_names List[Token]

Tokens with possibly dotted name of variable parts in the group.

required

Attributes:

Name Type Description
node Optional[Node]

Node representing the group, if available.

Source code in src/raesl/compile/ast/components.py
def __init__(self, name_tok: "Token", variablepart_names: List["Token"]):
    self.name_tok = name_tok
    self.variablepart_names = variablepart_names

    # Set during type checking.
    self.node: Optional["Node"] = None

get_doc_comment_comp_elements

get_doc_comment_comp_elements(
    comp: ComponentDefinition,
) -> Generator[DocStore, None, None]

Retrieve the component elements interested in getting documentation comments from the input. This includes the component itself, so you can add documentation to it in its 'comments' section.

Parameters:

Name Type Description Default
comp ComponentDefinition

Component definition to search.

required

Returns:

Type Description
Generator[DocStore, None, None]

Generator yielding interested elements.

Source code in src/raesl/compile/ast/components.py
def get_doc_comment_comp_elements(
    comp: ComponentDefinition,
) -> Generator[comment_storage.DocStore, None, None]:
    """Retrieve the component elements interested in getting documentation comments from
    the input. This includes the component itself, so you can add documentation to it
    in its 'comments' section.

    Arguments:
        comp: Component definition to search.

    Returns:
        Generator yielding interested elements.
    """
    all_elems: List[
        Union[
            List[ComponentDefinition],
            List[VarParam],
            List[ComponentInstance],
            List[Need],
            List[Goal],
            List[Transformation],
            List[Design],
            List[RelationInstance],
            List[BehaviorFunction],
        ]
    ] = [
        [comp],
        comp.variables,
        comp.parameters,
        comp.component_instances,
        comp.needs,
        comp.goals,
        comp.transforms,
        comp.designs,
        comp.relations,
        comp.behaviors,
    ]
    for elems in all_elems:
        for elem in elems:
            if elem.doc_tok:
                yield elem

exprs

Expressions to store and reason about values and boundaries.

Comparison

Comparison(is_constraint: bool)

Bases: Expression

Class storing a comparison.

Parameters:

Name Type Description Default
is_constraint bool

Whether the comparison is considered to be a constraint rather than a requirement.

required
Source code in src/raesl/compile/ast/exprs.py
def __init__(self, is_constraint: bool):
    super(Comparison, self).__init__()
    self.is_constraint = is_constraint

DataValue

Some kind of data value. Do not use this, but use a derived class instead.

get_units
get_units() -> Optional[Set[str]]

Obtain the units of the value as a set of names without square brackets. A value without units yields the empty set; a value kind that does not support units yields None.

Returns:

Type Description
Optional[Set[str]]

Names of the available units without square brackets or None.

Source code in src/raesl/compile/ast/exprs.py
def get_units(self) -> Optional[Set[str]]:
    """Obtain the units of the value. Gives a set of names without square brackets
    where lack of units results in the empty set, and not supporting units gives
    None.

    Returns:
        Names of the available units without square brackets or None.
    """
    raise NotImplementedError("Implement me in {}".format(repr(self)))
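The three possible outcomes of `get_units` can be summarized with a small hypothetical helper (`describe_units` is not part of the library API):

```python
from typing import Optional, Set

def describe_units(units: Optional[Set[str]]) -> str:
    """Hypothetical helper showing the get_units contract."""
    if units is None:
        return "units not supported"  # value kind has no unit concept
    if not units:
        return "no units"  # units supported, none specified
    return "units: " + ", ".join(sorted(units))

print(describe_units(None))         # units not supported
print(describe_units(set()))        # no units
print(describe_units({"mm", "m"}))  # units: m, mm
```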

Disjunction

Disjunction(childs: Sequence[Expression])

Bases: Expression

Disjunctive expression (also known as 'or' expression). It is true iff at least one of its child expressions is true.

Parameters:

Name Type Description Default
childs Sequence[Expression]

Child expressions of the disjunction. It is recommended to have at least two children in an object.

required
Source code in src/raesl/compile/ast/exprs.py
def __init__(self, childs: Sequence[Expression]):
    self.childs = childs

Expression

Base class of an expression.

ObjectiveComparison

ObjectiveComparison(
    lhs_var: VariableValue, aux_tok: Token, maximize: bool
)

Bases: Comparison

An intended direction for a variable. Note that the 'maximize' parameter controls both the 'maximize' and 'minimize' desires.

Parameters:

Name Type Description Default
lhs_var VariableValue

Variable with the objective.

required
aux_tok Token

One of the auxiliary verbs expressing strength of the objective.

required
maximize bool

If set the comparison expresses the desire to maximize the variable.

required

Attributes:

Name Type Description
minimize bool

Opposite of maximize.

Source code in src/raesl/compile/ast/exprs.py
def __init__(self, lhs_var: VariableValue, aux_tok: "Token", maximize: bool):
    super(ObjectiveComparison, self).__init__(False)
    self.lhs_var = lhs_var
    self.aux_tok = aux_tok
    self.maximize = maximize
get_location
get_location() -> Location

Return a location to point at the comparison for error reporting purposes.

Source code in src/raesl/compile/ast/exprs.py
def get_location(self) -> "Location":
    """Return a location to point at the comparison for error reporting purposes."""
    return self.aux_tok.get_location()

RelationComparison

RelationComparison(
    is_constraint: bool,
    lhs_var: VariableValue,
    isaux_tok: Token,
    cmp_tok: Token,
    rhs_varval: DataValue,
)

Bases: Comparison

A relation between a variable and either a value or a variable.

Parameters:

Name Type Description Default
is_constraint bool

Whether the comparison is considered to be a constraint rather than a requirement.

required
lhs_var VariableValue

Left hand side variable being compared.

required
isaux_tok Token

'is' for a constraint, else the aux word for expressing strength of the comparison.

required
cmp_tok Token

One of the keywords that express the comparison to perform.

required
rhs_varval DataValue

Right hand side variable or value.

required

Attributes:

Name Type Description
math_compare

Translated 'cmp_tok', with the ascii math text.

Source code in src/raesl/compile/ast/exprs.py
def __init__(
    self,
    is_constraint: bool,
    lhs_var: VariableValue,
    isaux_tok: "Token",
    cmp_tok: "Token",
    rhs_varval: DataValue,
):
    super(RelationComparison, self).__init__(is_constraint)
    self.lhs_var = lhs_var
    self.isaux_tok = isaux_tok
    self.cmp_tok = cmp_tok
    self.math_compare = _MATH_OP_TRANSLATE[cmp_tok.tok_type]
    self.rhs_varval = rhs_varval
get_location
get_location() -> Location

Return a location to point at the comparison for error reporting purposes.

Source code in src/raesl/compile/ast/exprs.py
def get_location(self) -> "Location":
    """Return a location to point at the comparison for error reporting purposes."""
    return self.isaux_tok.get_location()
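The `math_compare` attribute is filled from an internal `_MATH_OP_TRANSLATE` table keyed on the comparison token type. A hedged sketch of such a mapping (the token type names below are assumptions, not the compiler's actual constants) could look like:

```python
# Hypothetical translation table from comparison keywords to ASCII math.
MATH_OP_TRANSLATE = {
    "at-least": ">=",
    "at-most": "<=",
    "equal-to": "==",
    "not-equal-to": "!=",
    "smaller-than": "<",
    "greater-than": ">",
}

def math_compare(cmp_tok_type: str) -> str:
    """Translate a comparison keyword into its ASCII math operator."""
    return MATH_OP_TRANSLATE[cmp_tok_type]

print(math_compare("at-least"))  # >=
```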

Value

Value(value: Token, unit: Optional[Token] = None)

Bases: DataValue

A value with an optional unit. Don't modify these objects in-place, create a new object instead.

Parameters:

Name Type Description Default
value Token

Stored value as text.

required
unit Optional[Token]

Either None or text describing the unit. Treat as read-only, as changing it may break the cache.

None

Attributes:

Name Type Description
_unit_cache Optional[Set[str]]

Units of the literal after normalizing self.unit. Computed on demand.

Source code in src/raesl/compile/ast/exprs.py
def __init__(self, value: "Token", unit: Optional["Token"] = None):
    super(Value, self).__init__()
    self.value = value
    self.unit = unit
    self._unit_cache: Optional[Set[str]] = None  # Lazily computed.

    assert self.unit is None or self.unit.tok_text not in ("", "[]")

VariableValue

VariableValue(var_tok: Token)

Bases: DataValue

Class representing a variable or parameter as value.

Parameters:

Name Type Description Default
var_tok Token

Token stating the possibly dotted name of the variable.

required

Attributes:

Name Type Description
var_node Optional[VarNode]

If not None, the node represented by the object. Set during type checking.

Source code in src/raesl/compile/ast/exprs.py
def __init__(self, var_tok: "Token"):
    self.var_tok = var_tok

    self.var_node: Optional["VarNode"] = None

nodes

Node classes representing elementary and combined flows.

CompoundVarNode

CompoundVarNode(
    name_tok: Token,
    the_type: BaseType,
    child_nodes: List[VarNode],
)

Bases: VarNode, DocAddElement

Grouped variable/parameter node.

Parameters:

Name Type Description Default
name_tok Token

Token with the name/position of the variable or parameter.

required
the_type BaseType

Type of the variable or parameter.

required
child_nodes List[VarNode]

Child nodes of the group.

required

Attributes:

Name Type Description
name_index

Mapping of name to the associated VarNode instance.

Source code in src/raesl/compile/ast/nodes.py
def __init__(self, name_tok: "Token", the_type: "BaseType", child_nodes: List[VarNode]):
    super(CompoundVarNode, self).__init__(name_tok, the_type)
    self.child_nodes = child_nodes
    self.name_index = dict((cn.name_tok.tok_text, cn) for cn in child_nodes)

    assert isinstance(self.child_nodes, list)

    # Bundle doesn't have duplicate child names.
    assert len(child_nodes) == len(self.name_index)
add_comment
add_comment(comment_tok: Token)

Compound node doesn't own a store, push comment down to all children.

Source code in src/raesl/compile/ast/nodes.py
def add_comment(self, comment_tok: "Token"):
    """Compound node doesn't own a store, push comment down to all children."""
    for child in self.child_nodes:
        child.add_comment(comment_tok)
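The push-down behavior can be sketched with minimal stand-in node classes: a compound node has no comment store of its own, so it forwards each comment to all children, while elementary nodes store the comment themselves.

```python
class ElementaryNode:
    """Stand-in for ElementaryVarNode: stores comments itself."""
    def __init__(self):
        self.comments = []

    def add_comment(self, comment: str):
        self.comments.append(comment)

class CompoundNode:
    """Stand-in for CompoundVarNode: forwards comments to children."""
    def __init__(self, child_nodes):
        self.child_nodes = child_nodes

    def add_comment(self, comment: str):
        for child in self.child_nodes:
            child.add_comment(comment)

width, height = ElementaryNode(), ElementaryNode()
size = CompoundNode([width, height])
size.add_comment("Shared documentation for all fields.")
print(width.comments, height.comments)
```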

ElementaryVarNode

ElementaryVarNode(
    name_tok: Token, the_type: BaseType, counter: Counter
)

Bases: VarNode, DocElement

Elementary variable/parameter node.

Parameters:

Name Type Description Default
name_tok Token

Token with the name/position of the variable or parameter.

required
the_type BaseType

Type of the variable or parameter.

required
counter Counter

Object that hands out unique identification numbers while remaining robust against re-use of imported modules.

required

Attributes:

Name Type Description
id

Unique number of the node, mostly useful for dumps and debugging.

comments List[str]

Stored comments of the node.

Source code in src/raesl/compile/ast/nodes.py
def __init__(self, name_tok: "Token", the_type: "BaseType", counter: "Counter"):
    super(ElementaryVarNode, self).__init__(name_tok, the_type)
    self.id = counter.next()
    self.comments: List[str] = []
add_comment
add_comment(comment_tok: Token)

Add found documentation comment.

Parameters:

Name Type Description Default
comment_tok Token

The raw documentation token to add.

required
Source code in src/raesl/compile/ast/nodes.py
def add_comment(self, comment_tok: "Token"):
    """Add found documentation comment.

    Arguments:
        comment_tok: The raw documentation token to add.
    """
    self.comments.extend(comment_storage.decode_doc_comments(comment_tok))

GroupNode

GroupNode(name_tok: Token, child_nodes: List[Node])

Bases: Node

Class describing content of a variable group. Unlike the VarNode above, a group node has no type of its own.

Parameters:

Name Type Description Default
name_tok Token

Name of the group node.

required
child_nodes List[Node]

Elements of the group.

required
Source code in src/raesl/compile/ast/nodes.py
def __init__(self, name_tok: "Token", child_nodes: List[Node]):
    super(GroupNode, self).__init__(name_tok)
    self.child_nodes = child_nodes

Node

Node(name_tok: Token)

Abstract class for nodes.

Note that a Node only has a name. The typed sub-tree starts with VarNode.

Parameters:

Name Type Description Default
name_tok Token

Name of the node.

required
Source code in src/raesl/compile/ast/nodes.py
def __init__(self, name_tok: "Token"):
    self.name_tok = name_tok

VarNode

VarNode(name_tok: Token, the_type: BaseType)

Bases: Node

Abstract base class of a variable or parameter node that can be shared with variable groups and other users such as transformations and goals.

Parameters:

Name Type Description Default
name_tok Token

Token with the name/location of the variable or parameter.

required
the_type BaseType

Type of the variable or parameter.

required
Source code in src/raesl/compile/ast/nodes.py
def __init__(self, name_tok: "Token", the_type: "BaseType"):
    super(VarNode, self).__init__(name_tok)
    self.the_type = the_type

    assert name_tok.tok_type == "NAME"  # Should be a plain identifier.
add_comment
add_comment(comment_tok: Token)

Add found documentation comment.

Parameters:

Name Type Description Default
comment_tok Token

The raw documentation token to add.

required
Source code in src/raesl/compile/ast/nodes.py
def add_comment(self, comment_tok: "Token"):
    """Add found documentation comment.

    Arguments:
        comment_tok: The raw documentation token to add.
    """
    raise NotImplementedError("Implement me in {}.".format(repr(self)))
get_error_position
get_error_position(name: str) -> int

Return the index in the given string where an error occurs in resolving the node.

Parameters:

Name Type Description Default
name str

Name of the element to find.

required

Returns:

Type Description
int

Approximated index in the string where matching the element fails. Returned value has no meaning if resolving a node succeeds.

Source code in src/raesl/compile/ast/nodes.py
def get_error_position(self, name: str) -> int:
    """Return the index in the given string where an error occurs in resolving the
    node.

    Arguments:
        name: Name of the element to find.

    Returns:
        Approximated index in the string where matching the element fails.
            Returned value has no meaning if resolving a node succeeds.
    """
    raise NotImplementedError("Implement me in {}.".format(repr(self)))
resolve_node
resolve_node(name: str) -> Optional[VarNode]

Find the varparam (sub)node that matches the provided dotted 'name'.

Parameters:

Name Type Description Default
name str

Possibly dotted name that should point at an existing sub-node. The empty string denotes 'self'.

required

Returns:

Type Description
Optional[VarNode]

The node that matches the name, or None if no such node exists. In the latter case, use 'self.get_error_position(name)' to get an indication where the match fails in the name.

Source code in src/raesl/compile/ast/nodes.py
def resolve_node(self, name: str) -> Optional["VarNode"]:
    """Find the varparam (sub)node that matches the provided dotted 'name'.

    Arguments:
        name: Possibly dotted name that should point at an existing sub-node.
            The empty string denotes 'self'.

    Returns:
        The node that matches the name, or None if no such node exists. In the
            latter case, use 'self.get_error_position(name)' to get an indication
            where the match fails in the name.
    """
    raise NotImplementedError("Implement me in {}.".format(repr(self)))

relations

Relation definition and instantiation.

RelationDefParameter

RelationDefParameter(
    name: Token,
    type_name: Token,
    direction: str,
    multi: bool,
)

Parameter of a relation definition.

Parameters:

Name Type Description Default
name Token

Name of the parameter.

required
type_name Token

Name of the type of the parameter.

required
direction str

Direction of the parameter.

required
multi bool

If set, parameter may be specified more than once.

required

Attributes:

Name Type Description
type Optional[BaseType]

Actual type of the parameter.

Source code in src/raesl/compile/ast/relations.py
def __init__(self, name: "Token", type_name: "Token", direction: str, multi: bool):
    self.name = name
    self.type_name = type_name
    self.direction = direction
    self.multi = multi

    self.type: Optional["BaseType"] = None

RelationDefinition

RelationDefinition(name: Token)

Bases: DefaultDocStore

Relation definition.

Parameters:

Name Type Description Default
name Token

Name of the relation definition.

required

Attributes:

Name Type Description
params List[RelationDefParameter]

Parameters of the definition.

Source code in src/raesl/compile/ast/relations.py
def __init__(self, name: "Token"):
    super(RelationDefinition, self).__init__(name)
    self.name = name
    self.params: List[RelationDefParameter] = []

specification

Overall output specification.

Specification

Specification()

Main class storing the contents of a complete ESL specification.

Attributes:

Name Type Description
types Dict[str, TypeDef]

Types of the specification.

verb_prepos List[VerbPreposDef]

Verbs and pre-positions.

rel_defs List[RelationDefinition]

Relation definitions.

comp_defs List[ComponentDefinition]

Component definitions.

world Optional[ComponentDefinition]

Root component.

Source code in src/raesl/compile/ast/specification.py
def __init__(self):
    self.types: Dict[str, types.TypeDef] = {}
    self.verb_prepos: List["VerbPreposDef"] = []
    self.rel_defs: List["RelationDefinition"] = []

    self.comp_defs: List[components.ComponentDefinition] = []
    self.world: Optional[components.ComponentDefinition] = None

dump

dump(
    spec: Specification,
    output_stream: Optional[TextIO] = None,
)

Dump the provided specification to an output.

Parameters:

Name Type Description Default
spec Specification

Specification to dump.

required
output_stream Optional[TextIO]

Stream to write to, None means stdout.

None
Source code in src/raesl/compile/ast/specification.py
def dump(spec: Specification, output_stream: Optional[TextIO] = None):
    """Dump the provided specification to an output.

    Arguments:
        spec: Specification to dump.
        output_stream: Stream to write to, None means stdout.
    """
    dump_spec = _DumpSpec(output_stream)
    dump_spec.dump(spec)

get_doc_comment_spec_elements

get_doc_comment_spec_elements(
    spec: Specification,
) -> Generator[DocStore, None, None]

Retrieve the specification elements interested in getting documentation comments from the input.

Component definitions are added through components.get_doc_comment_comp_elements.

The implementation depends heavily on the internal layout of the specification data structure.

Source code in src/raesl/compile/ast/specification.py
def get_doc_comment_spec_elements(
    spec: Specification,
) -> Generator["DocStore", None, None]:
    """Retrieve the specification elements interested in getting documentation comments
    from the input.

    Note: Component definitions are added through
        components.get_doc_comment_comp_elements.

    Note: The implementation depends heavily on the internal layout of the
        specification data structure.
    """
    elem: DefaultDocStore
    for elem in spec.verb_prepos:
        assert isinstance(elem, DocStore)
        if elem.doc_tok:
            yield elem

    for elem in spec.rel_defs:
        assert isinstance(elem, DocStore)
        if elem.doc_tok:
            yield elem

unfold_disjunction

unfold_disjunction(expr: Expression) -> List[Expression]

Convert the child expressions of top-level disjunction expressions to a list.

Source code in src/raesl/compile/ast/specification.py
def unfold_disjunction(expr: "Expression") -> List["Expression"]:
    """Convert the child expressions of top-level disjunction expressions to a list."""
    if isinstance(expr, exprs.Disjunction):
        result = []
        for child in expr.childs:
            result.extend(unfold_disjunction(child))
        return result
    else:
        return [expr]
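A self-contained sketch with stand-in classes shows how nested disjunctions are flattened into a single list (the `Leaf` class is hypothetical, standing in for any non-disjunction expression):

```python
class Expression:
    pass

class Disjunction(Expression):
    def __init__(self, childs):
        self.childs = childs

class Leaf(Expression):
    """Hypothetical stand-in for e.g. a RelationComparison."""
    def __init__(self, name):
        self.name = name

def unfold_disjunction(expr):
    """Flatten nested top-level disjunctions into a list of children."""
    if isinstance(expr, Disjunction):
        result = []
        for child in expr.childs:
            result.extend(unfold_disjunction(child))
        return result
    return [expr]

a, b, c = Leaf("a"), Leaf("b"), Leaf("c")
nested = Disjunction([a, Disjunction([b, c])])
print([leaf.name for leaf in unfold_disjunction(nested)])  # ['a', 'b', 'c']
```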

types

AST storage of types.

BaseType

Base class of a type.

get_units
get_units() -> Optional[Set[str]]

Retrieve the units that may be used with values of the type.

Returns:

Type Description
Optional[Set[str]]

Set of unit names, set(['-']) if it has no units specified, or None if the type doesn't support units.

Source code in src/raesl/compile/ast/types.py
def get_units(self) -> Optional[Set[str]]:
    """Retrieve the units that may be used with values of the type.

    Returns:
        Set of unit names, set(['-']) if it has no units specified, or None if the
            type doesn't support units.
    """
    raise NotImplementedError("Implement me in {}.".format(repr(self)))

Compound

Compound(fields: List[CompoundField])

Bases: BaseType

A collection of named typed values. Note that a Compound cannot have parents, units, or intervals.

Parameters:

Name Type Description Default
fields List[CompoundField]

Fields of the compound.

required
Source code in src/raesl/compile/ast/types.py
def __init__(self, fields: List[CompoundField]):
    super(Compound, self).__init__()
    self.fields = fields

CompoundField

CompoundField(name: Token, the_type: BaseType)

A named field in a Compound.

Parameters:

Name Type Description Default
name Token

Name of the compound field.

required
the_type BaseType

Type of the compound field.

required
Source code in src/raesl/compile/ast/types.py
def __init__(self, name: "Token", the_type: BaseType):
    self.name = name
    self.type = the_type

ElementaryType

ElementaryType(
    parent: Optional[ElementaryType],
    units: List[Token],
    intervals: Optional[
        List[Tuple[Optional[Value], Optional[Value]]]
    ],
)

Bases: BaseType

A type of a singular value in ESL. The allowed values in a parent type always have priority over the allowed values in a child type.

Parameters:

Name Type Description Default
parent Optional[ElementaryType]

Parent elementary type if specified.

required
units List[Token]

Allowed units of the type, should not have square brackets around the text.

required
intervals Optional[List[Tuple[Optional[Value], Optional[Value]]]]

Disjunction of allowed ranges of the type, pairs of (lowerbound, upperbound) where one of the bounds may be None. Constants and enumerations are expressed as intervals with the same lower and upper bound.

required
Source code in src/raesl/compile/ast/types.py
def __init__(
    self,
    parent: Optional["ElementaryType"],
    units: List["Token"],
    intervals: Optional[List[Tuple[Optional["Value"], Optional["Value"]]]],
):
    super(ElementaryType, self).__init__()
    self.parent = parent
    self.units = units
    self.intervals = intervals

    assert parent is None or isinstance(parent, ElementaryType)
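A short sketch (not part of the library API) shows how such interval lists can be interpreted: None acts as an open bound, and a constant is an interval with equal lower and upper bounds.

```python
from typing import List, Optional, Tuple

Interval = Tuple[Optional[float], Optional[float]]

def in_intervals(value: float, intervals: Optional[List[Interval]]) -> bool:
    """Hypothetical membership check over a disjunction of intervals."""
    if intervals is None:  # no restriction specified
        return True
    return any(
        (lo is None or value >= lo) and (hi is None or value <= hi)
        for lo, hi in intervals
    )

print(in_intervals(5.0, [(0.0, 10.0)]))               # True
print(in_intervals(5.0, [(None, 3.0), (7.0, None)]))  # False
print(in_intervals(3.0, [(3.0, 3.0)]))                # constant: True
```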

TypeDef

TypeDef(name: Token, the_type: BaseType)

A named type.

Parameters:

Name Type Description Default
name Token

Name of the type definition.

required
the_type BaseType

Type associated with the name.

required
Source code in src/raesl/compile/ast/types.py
def __init__(self, name: "Token", the_type: BaseType):
    self.name = name
    self.type = the_type

verbs

Verb / preposition definitions in ESL.

VerbPreposDef

VerbPreposDef(verb: Token, prepos: Token)

Bases: DefaultDocStore

A verb and a pre-position definition.

Parameters:

Name Type Description Default
verb Token

Verb token.

required
prepos Token

Pre-position token.

required
Source code in src/raesl/compile/ast/verbs.py
def __init__(self, verb: "Token", prepos: "Token"):
    super(VerbPreposDef, self).__init__(verb)
    self.verb = verb
    self.prepos = prepos

cli

ESL compiler Command Line Interface.

compile

compile(
    paths: List[str], output: Optional[str], force: bool
)

Run the ESL compiler.

Source code in src/raesl/compile/cli.py
@click.command("compile")
@click.argument("paths", nargs=-1, type=click.Path(exists=True, file_okay=True, dir_okay=True))
@click.option(
    "--output",
    "-o",
    default=None,
    type=click.Path(file_okay=True, dir_okay=False),
    help="Graph output file.",
)
@click.option(
    "--force",
    "-f",
    default=False,
    is_flag=True,
    help="Whether to overwrite an existing output file.",
)
def compile(paths: List[str], output: Optional[str], force: bool):
    """Run the ESL compiler."""
    run(*paths, output=output, force=force)

run

run(
    *paths: Union[str, Path],
    output: Optional[Union[str, Path]] = None,
    force: bool = False,
    files: Optional[Union[List[str], List[Path]]] = None
) -> Tuple[
    DiagnosticStore,
    Optional[Specification],
    Optional[Graph],
]

Run the compiler on ESL files.

Parameters:

Name Type Description Default
paths Union[str, Path]

Paths to resolve into ESL files. May be any number of files and directories to scan.

()
output Optional[Union[str, Path]]

Optional output file (JSON) to write the graph to.

None
force bool

Whether to overwrite the output file or raise an error if the file already exists.

False
files Optional[Union[List[str], List[Path]]]

Optional paths argument (deprecated).

None

Returns:

Type Description
DiagnosticStore

Diagnostic storage.

Optional[Specification]

Specification object (if successfully parsed).

Optional[Graph]

Instantiated graph (if successfully instantiated).

Source code in src/raesl/compile/cli.py
def run(
    *paths: Union[str, Path],
    output: Optional[Union[str, Path]] = None,
    force: bool = False,
    files: Optional[Union[List[str], List[Path]]] = None,
) -> Tuple["DiagnosticStore", Optional["Specification"], Optional["Graph"]]:
    """Run the compiler on ESL files.

    Arguments:
        paths: Paths to resolve into ESL files. May be any number of files and
            directories to scan.
        output: Optional output file (JSON) to write the graph to.
        force: Whether to overwrite the output file or raise an error if the file
            already exists.
        files: Optional paths argument (deprecated).

    Returns:
        Diagnostic storage.
        Specification object (if successfully parsed).
        Instantiated graph (if successfully instantiated).
    """
    if files is not None:
        paths = tuple(files)
        msg = " ".join(
            (
                "The 'files' keyword argument will be deprecated.",
                "Please use your file and directory paths as (any number of)",
                "positional arguments to this function.",
                "Also, take a look at 'raesl.compile.to_graph'",
                "or its alias 'ragraph.io.esl.from_esl'. to obtain a Graph.",
            )
        )
        logger.warning(msg)

    try:
        in_files = get_esl_paths(*paths)
        out_file = None if output is None else check_output_path(output, force)
    except ValueError as e:
        if click.get_current_context(silent=True):
            logger.error(str(e))
            sys.exit(1)
        raise e

    # Parse lexers per file.
    diag_store, spec = parser.parse_spec(
        scanner.Lexer(str(f), f.read_text(), 0, 0, 0, []) for f in in_files
    )

    # Errors have been reported to stdout already.
    if diag_store.has_severe() or spec is None:
        if click.get_current_context(silent=True):
            sys.exit(1)
        return diag_store, None, None

    # Succeeded so far: instantiate.
    graph = graph_building.GraphFactory(diag_store=diag_store, spec=spec).make_graph()
    if diag_store.has_severe() or graph is None:
        if click.get_current_context(silent=True):
            sys.exit(1)
        return diag_store, spec, None

    if output is not None:
        from ragraph.io.json import to_json

        to_json(graph, path=out_file)

    return diag_store, spec, graph

diagnostics

ESL compiler Diagnostics.

Diagnostic code scheme:

Severity:

    E###: ERROR
    W###: WARNING
    I###: INFO
    H###: HINT


Origin:

    #000: General
    #100: Scanning/Parsing
    #200: Typechecking
    #300: AST Builder
    #400: Instance/output builder

DiagnosticStore

DiagnosticStore(
    severity: DiagnosticSeverity = ERROR, exit: bool = False
)

Storage of found diagnostics.

Attributes:

Name Type Description
diagnostics List[EslDiagnostic]

Stored diagnostics.

severity

What diagnostics to log.

exit

Exit on error.

Source code in src/raesl/compile/diagnostics.py
def __init__(self, severity: DiagnosticSeverity = ERROR, exit: bool = False):
    self.diagnostics: List[EslDiagnostic] = []
    self.severity = severity
    self.exit = exit

add

add(diagnostic: EslDiagnostic)

Add a diagnostic. Report directly if below severity threshold.

Source code in src/raesl/compile/diagnostics.py
def add(self, diagnostic: EslDiagnostic):
    """Add a diagnostic. Report directly if below severity threshold."""
    self.diagnostics.append(diagnostic)
    if diagnostic.severity <= self.severity:
        self.report(diagnostic)
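The threshold check in `add` relies on LSP-style severity ordering, in which a lower number is more severe (1 = error, 2 = warning, 3 = info, 4 = hint). A standalone sketch of that pattern, with plain integers standing in for `DiagnosticSeverity`:

```python
from typing import List

# LSP-style severities: lower value means more severe.
ERROR, WARN, INFO, HINT = 1, 2, 3, 4


class Store:
    def __init__(self, severity: int = ERROR):
        self.diagnostics: List[int] = []
        self.reported: List[int] = []
        self.severity = severity

    def add(self, severity: int) -> None:
        # Always store; report immediately only at or above the threshold.
        self.diagnostics.append(severity)
        if severity <= self.severity:
            self.reported.append(severity)


store = Store(severity=WARN)
for s in (ERROR, WARN, INFO, HINT):
    store.add(s)
# All four are stored, but only ERROR and WARN are reported immediately.
```

This is why lowering the store's `severity` to `ERROR` silences immediate warnings while still keeping them available for a later `dump`.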

dump

dump(test: bool = False, stream: Optional[IO[str]] = None)

Dump all stored diagnostics to the given stream.

Parameters:

Name Type Description Default
test bool

Whether to output in test mode (otherwise: user-friendly).

False
stream Optional[IO[str]]

Output stream to use. Defaults to stdout.

None
Source code in src/raesl/compile/diagnostics.py
def dump(self, test: bool = False, stream: Optional[IO[str]] = None):
    """Dump all stored diagnostics to the given stream.

    Arguments:
        test: Whether to output in test mode (otherwise: user-friendly).
        stream: Output stream to use. Defaults to stdout.
    """
    stream = click.get_text_stream("stdout") if stream is None else stream

    if not test and not self.diagnostics:
        stream.write("No diagnostics to report.")

    for d in self.diagnostics:
        stream.write(str(d))
        stream.write("\n")

has_severe

has_severe() -> bool

Whether there are severe diagnostics stored.

Source code in src/raesl/compile/diagnostics.py
def has_severe(self) -> bool:
    """Whether there are severe diagnostics stored."""
    return any(d.severity not in _NON_SEVERE for d in self.diagnostics)

report

report(diagnostic: EslDiagnostic)

Report a single diagnostic.

Source code in src/raesl/compile/diagnostics.py
def report(self, diagnostic: EslDiagnostic):
    """Report a single diagnostic."""
    message = str(diagnostic)  # Human-readable (e.g., 1-based instead of 0-based)

    if diagnostic.severity <= ERROR:
        logger.error(message)
        if self.exit:
            sys.exit(1)
    elif diagnostic.severity <= WARN:
        logger.warning(message)
    else:
        logger.info(message)

EslDiagnostic

EslDiagnostic(
    message: str,
    location: Location = get_location(),
    severity: DiagnosticSeverity = ERROR,
    code: str = "E100",
    source: str = "RaESL compiler",
    related_information: Optional[
        List[DiagnosticRelatedInformation]
    ] = None,
)

Bases: Diagnostic

An unscoped diagnostic as ESL works with multiple text documents at once.

Parameters:

Name Type Description Default
message str

The diagnostic's message.

required
location Location

The location at which the message applies.

get_location()
severity DiagnosticSeverity

The diagnostic's severity. Can be omitted. If omitted it is up to the client to interpret diagnostics as error, warning, info or hint.

ERROR
code str

The diagnostic's code, which might appear in the user interface.

'E100'
source str

A human-readable string describing the source of this diagnostic, e.g. 'esl', or 'esl compiler'.

'RaESL compiler'
related_information List[DiagnosticRelatedInformation]

A list of related diagnostic information, e.g. when symbol-names within a scope collide you can mark all definitions via this property.

None

Attributes:

Name Type Description
range

The range at which the message applies.

Source code in src/raesl/compile/diagnostics.py
def __init__(
    self,
    message: str,
    location: Location = utils.get_location(),
    severity: DiagnosticSeverity = ERROR,
    code: str = "E100",
    source: str = "RaESL compiler",
    related_information: Optional[List[DiagnosticRelatedInformation]] = None,
):
    self.location = location
    range = None if location is None else location.range
    super().__init__(message, range, severity, code, source, related_information)

E100

E100(location: Location = get_location()) -> EslDiagnostic

Unexpected end of the specification.

Source code in src/raesl/compile/diagnostics.py
def E100(location: Location = utils.get_location()) -> EslDiagnostic:
    """Unexpected end of the specification."""
    return EslDiagnostic(
        "Unexpected end of the specification.",
        location=location,
        severity=ERROR,
        code="E100",
        source="ESL parser",
    )

E101

E101(
    acceptors: Iterable[str],
    location: Location = get_location(),
) -> EslDiagnostic

Best line match is ambiguous. Found multiple acceptors: '{acceptors}'. This is most likely an internal error.

Source code in src/raesl/compile/diagnostics.py
def E101(acceptors: Iterable[str], location: Location = utils.get_location()) -> EslDiagnostic:
    """Best line match is ambiguous. Found multiple acceptors: '{acceptors}'. This
    is most likely an internal error.
    """
    enum = "', '".join(acceptors)
    return EslDiagnostic(
        (
            f"Best line match is ambiguous. Found multiple acceptors: '{enum}'. "
            + "This is most likely an internal error."
        ),
        location=location,
        severity=ERROR,
        code="E101",
        source="ESL parser",
    )

E102

E102(location: Location = get_location()) -> EslDiagnostic

Syntax error.

Source code in src/raesl/compile/diagnostics.py
def E102(location: Location = utils.get_location()) -> EslDiagnostic:
    """Syntax error."""
    return EslDiagnostic(
        "Syntax error.",
        location=location,
        severity=ERROR,
        code="E102",
        source="ESL parser",
    )

E200

E200(
    name: str,
    kind: str,
    location: Location = get_location(),
    dupes: Optional[List[Location]] = None,
) -> EslDiagnostic

Multiple {kind} named '{name}'.

Source code in src/raesl/compile/diagnostics.py
def E200(
    name: str,
    kind: str,
    location: Location = utils.get_location(),
    dupes: Optional[List[Location]] = None,
) -> EslDiagnostic:
    """Multiple {kind} named '{name}'."""
    dupes = [] if dupes is None else dupes
    return EslDiagnostic(
        f"Multiple {kind}s named '{name}'.",
        location=location,
        severity=ERROR,
        code="E200",
        source="ESL typechecker",
        related_information=[EslRelated(dupe, f"Duplicate {kind} '{name}'.") for dupe in dupes],
    )
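Each `E…`/`W…` helper in this module follows the same pattern: a module-level factory that pre-fills a diagnostic with a fixed code, severity, and source, leaving only the variable message parts as arguments. A generic standalone sketch of that pattern (the `Diag` dataclass is illustrative, not the raesl type):

```python
from dataclasses import dataclass


@dataclass
class Diag:
    message: str
    severity: int
    code: str
    source: str


def E200(name: str, kind: str) -> Diag:
    # Pre-filled factory: only the message parts vary per call site.
    return Diag(
        message=f"Multiple {kind}s named '{name}'.",
        severity=1,  # ERROR
        code="E200",
        source="ESL typechecker",
    )


d = E200("power", "variable")
```

Keeping the code, severity, and source in one place per factory makes the diagnostic catalogue easy to audit against the code scheme above.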

E201

E201(
    section: str,
    context: str,
    location: Location = get_location(),
) -> EslDiagnostic

This '{section}' section is not allowed in the '{context}' context.

Source code in src/raesl/compile/diagnostics.py
def E201(section: str, context: str, location: Location = utils.get_location()) -> EslDiagnostic:
    """This '{section}' section is not allowed in the '{context}' context."""
    return EslDiagnostic(
        f"This '{section}' section is not allowed in the '{context}' context.",
        location=location,
        severity=ERROR,
        code="E201",
        source="ESL typechecker",
    )

E202

E202(
    kind: str,
    name: Optional[str] = None,
    location: Location = get_location(),
) -> EslDiagnostic

Missing {kind} for '{name}'.

Source code in src/raesl/compile/diagnostics.py
def E202(
    kind: str,
    name: Optional[str] = None,
    location: Location = utils.get_location(),
) -> EslDiagnostic:
    """Missing {kind} for '{name}'."""
    if name is None:
        msg = f"Missing {kind}."
    else:
        msg = f"Missing {kind} for '{name}'."
    return EslDiagnostic(
        msg,
        location=location,
        severity=ERROR,
        code="E202",
        source="ESL typechecker",
    )

E203

E203(
    kind: str,
    name: Optional[str] = None,
    location: Location = get_location(),
) -> EslDiagnostic

Unknown {kind} named '{name}'.

Source code in src/raesl/compile/diagnostics.py
def E203(
    kind: str, name: Optional[str] = None, location: Location = utils.get_location()
) -> EslDiagnostic:
    """Unknown {kind} named '{name}'."""
    if name is None:
        msg = f"Unknown {kind}."
    else:
        msg = f"Unknown {kind} named '{name}'."
    return EslDiagnostic(
        msg,
        location=location,
        severity=ERROR,
        code="E203",
        source="ESL typechecker",
    )

E204

E204(
    name: str,
    kind: str,
    location: Location = get_location(),
    cycle: Optional[List[Location]] = None,
) -> EslDiagnostic

Cyclically dependent {kind} named '{name}'.

Source code in src/raesl/compile/diagnostics.py
def E204(
    name: str,
    kind: str,
    location: Location = utils.get_location(),
    cycle: Optional[List[Location]] = None,
) -> EslDiagnostic:
    """Cyclically dependent {kind} named '{name}'."""
    cycle = [] if cycle is None else cycle
    length = len(cycle)
    return EslDiagnostic(
        f"Cyclically dependent {kind} named '{name}'.",
        location=location,
        severity=ERROR,
        code="E204",
        source="ESL typechecker",
        related_information=[
            EslRelated(entry, f"Cycle {i+1}/{length}.") for i, entry in enumerate(cycle)
        ],
    )

E205

E205(
    name: str,
    context: str,
    location: Location = get_location(),
) -> EslDiagnostic

Cannot find {name} in {context}.

Source code in src/raesl/compile/diagnostics.py
def E205(name: str, context: str, location: Location = utils.get_location()) -> EslDiagnostic:
    """Cannot find {name} in {context}."""
    return EslDiagnostic(
        f"Cannot find {name} in {context}.",
        location=location,
        severity=ERROR,
        code="E205",
        source="ESL typechecker",
    )

E206

E206(
    name: str,
    kind: str,
    location: Location = get_location(),
    blocks: Optional[List[Location]] = None,
) -> EslDiagnostic

Found {kind} block(s), but the relation definition {name} has no such parameters.

Source code in src/raesl/compile/diagnostics.py
def E206(
    name: str,
    kind: str,
    location: Location = utils.get_location(),
    blocks: Optional[List[Location]] = None,
) -> EslDiagnostic:
    """Found {kind} block(s), but the relation definition {name} has no such
    parameters.
    """
    blocks = [] if blocks is None else blocks
    return EslDiagnostic(
        (
            f"Found {kind} block(s), "
            + f"but the relation definition {name} has no such parameters."
        ),
        location=location,
        severity=ERROR,
        code="E206",
        source="ESL typechecker",
        related_information=[EslRelated(block, f"Relation {kind} block") for block in blocks],
    )

E207

E207(
    name: str,
    kind: str,
    location: Location = get_location(),
    definition: Location = get_location(),
) -> EslDiagnostic

Relation instance '{name}' is missing a '{kind}' parameters section.

Source code in src/raesl/compile/diagnostics.py
def E207(
    name: str,
    kind: str,
    location: Location = utils.get_location(),
    definition: Location = utils.get_location(),
) -> EslDiagnostic:
    """Relation instance '{name}' is missing a '{kind}' parameters section."""
    return EslDiagnostic(
        f"Relation instance '{name}' is missing a '{kind}' parameters section.",
        location=location,
        severity=ERROR,
        code="E207",
        source="ESL typechecker",
        related_information=[EslRelated(definition, "Corresponding relation definition.")],
    )

E208

E208(
    name: str,
    kind: str,
    num: int,
    location: Location = get_location(),
    definition: Location = get_location(),
) -> EslDiagnostic

Relation instance '{name}' is missing at least {num} '{kind}' parameters.

Source code in src/raesl/compile/diagnostics.py
def E208(
    name: str,
    kind: str,
    num: int,
    location: Location = utils.get_location(),
    definition: Location = utils.get_location(),
) -> EslDiagnostic:
    """Relation instance '{name}' is missing at least {num} '{kind}' parameters."""
    return EslDiagnostic(
        f"Relation instance '{name}' is missing at least {num} '{kind}' parameters.",
        location=location,
        severity=ERROR,
        code="E208",
        source="ESL typechecker",
        related_information=[EslRelated(definition, "Corresponding relation definition.")],
    )

E209

E209(
    name: str,
    kind: str,
    other_kind: str,
    location: Location = get_location(),
    others: Optional[List[Location]] = None,
) -> EslDiagnostic

'{name}' is both a {kind} and a {other_kind}.

Source code in src/raesl/compile/diagnostics.py
def E209(
    name: str,
    kind: str,
    other_kind: str,
    location: Location = utils.get_location(),
    others: Optional[List[Location]] = None,
) -> EslDiagnostic:
    """'{name}' is both a {kind} and a {other_kind}."""
    others = [] if others is None else others
    return EslDiagnostic(
        f"'{name}' is both a {kind} and a {other_kind}.",
        location=location,
        severity=ERROR,
        code="E209",
        source="ESL typechecker",
        related_information=[EslRelated(other, f"{other_kind} location.") for other in others],
    )

E210

E210(
    lhs: Location,
    rhs: Location,
    reason: str = "are not compatible",
) -> EslDiagnostic

Values cannot be compared, they {reason}.

Source code in src/raesl/compile/diagnostics.py
def E210(lhs: Location, rhs: Location, reason: str = "are not compatible") -> EslDiagnostic:
    """Values cannot be compared, they {reason}."""
    return EslDiagnostic(
        f"Values cannot be compared, they {reason}.",
        location=lhs,
        severity=ERROR,
        code="E210",
        source="ESL typechecker",
        related_information=[EslRelated(rhs, "Other value location.")],
    )

E211

E211(
    verb: str,
    preposition: str,
    location: Location = get_location(),
) -> EslDiagnostic

Unsupported verb-preposition combination '{verb} {preposition}'.

Source code in src/raesl/compile/diagnostics.py
def E211(
    verb: str,
    preposition: str,
    location: Location = utils.get_location(),
) -> EslDiagnostic:
    """Unsupported verb-preposition combination '{verb} {preposition}'."""
    return EslDiagnostic(
        f"Unsupported verb-preposition combination '{verb} {preposition}'.",
        location=location,
        severity=ERROR,
        code="E211",
        source="ESL typechecker",
    )

E212

E212(
    kind: str,
    value: str,
    allowed: str,
    name: Optional[str] = None,
    location: Location = get_location(),
) -> EslDiagnostic

{kind.capitalize()} '{name}' uses '{value}', but should use {allowed}.

Source code in src/raesl/compile/diagnostics.py
def E212(
    kind: str,
    value: str,
    allowed: str,
    name: Optional[str] = None,
    location: Location = utils.get_location(),
) -> EslDiagnostic:
    """{kind.capitalize()} '{name}' uses '{value}', but should use {allowed}."""
    if name is None:
        msg = f"{kind.capitalize()} uses '{value}', but should use {allowed}."
    else:
        msg = f"{kind.capitalize()} '{name}' uses '{value}', but should use {allowed}."
    return EslDiagnostic(
        msg,
        location=location,
        severity=ERROR,
        code="E212",
        source="ESL typechecker",
    )

E213

E213(
    kind: str,
    num: int,
    allowed: str,
    location: Location = get_location(),
    occurrences: Optional[List[Location]] = None,
) -> EslDiagnostic

Found {num} {kind}(s), but there should be {allowed}.

Source code in src/raesl/compile/diagnostics.py
def E213(
    kind: str,
    num: int,
    allowed: str,
    location: Location = utils.get_location(),
    occurrences: Optional[List[Location]] = None,
) -> EslDiagnostic:
    """Found {num} {kind}(s), but there should be {allowed}."""
    occurrences = [] if occurrences is None else occurrences
    return EslDiagnostic(
        f"Found {num} {kind}(s), but there should be {allowed}.",
        location=location,
        severity=ERROR,
        code="E213",
        source="ESL typechecker",
        related_information=[EslRelated(occ, f"{kind.capitalize()}") for occ in occurrences],
    )

E214

E214(
    name: str,
    location: Location = get_location(),
    def_location: Optional[Location] = None,
) -> EslDiagnostic

Definition of type '{name}' failed with an error.

Source code in src/raesl/compile/diagnostics.py
def E214(
    name: str,
    location: Location = utils.get_location(),
    def_location: Optional[Location] = None,
) -> EslDiagnostic:
    """Definition of type '{name}' failed with an error."""
    def_locs = [] if def_location is None else [def_location]
    return EslDiagnostic(
        f"Definition of type '{name}' failed with an error.",
        location=location,
        severity=ERROR,
        code="E214",
        source="ESL typechecker",
        related_information=[EslRelated(loc, "Related type definition.") for loc in def_locs],
    )

E215

E215(
    name: str, location: Location = get_location()
) -> EslDiagnostic

Type name '{name}' must be the name of an elementary type.

Source code in src/raesl/compile/diagnostics.py
def E215(
    name: str,
    location: Location = utils.get_location(),
) -> EslDiagnostic:
    """Type name '{name}' must be the name of an elementary type."""
    return EslDiagnostic(
        f"Type name '{name}' must be the name of an elementary type.",
        location=location,
        severity=ERROR,
        code="E215",
        source="ESL typechecker",
    )

E216

E216(
    name: str, location: Location = get_location()
) -> EslDiagnostic

Unit '{name}' should not have square brackets around its name.

Source code in src/raesl/compile/diagnostics.py
def E216(
    name: str,
    location: Location = utils.get_location(),
) -> EslDiagnostic:
    """Unit '{name}' should not have square brackets around its name."""
    return EslDiagnostic(
        f"Unit '{name}' should not have square brackets around its name.",
        location=location,
        severity=ERROR,
        code="E216",
        source="ESL typechecker",
    )

E217

E217(location: Location = get_location()) -> EslDiagnostic

The dimensionless unit '-' is not allowed to be specified explicitly.

Source code in src/raesl/compile/diagnostics.py
def E217(
    location: Location = utils.get_location(),
) -> EslDiagnostic:
    """The dimensionless unit '-' is not allowed to be specified explicitly."""
    return EslDiagnostic(
        "The dimensionless unit '-' is not allowed to be specified explicitly.",
        location=location,
        severity=ERROR,
        code="E217",
        source="ESL typechecker",
    )

E218

E218(
    name: str, location: Location = get_location()
) -> EslDiagnostic

Standard type '{name}' cannot be overridden.

Source code in src/raesl/compile/diagnostics.py
def E218(name: str, location: Location = utils.get_location()) -> EslDiagnostic:
    """Standard type '{name}' cannot be overridden."""
    return EslDiagnostic(
        f"Standard type '{name}' cannot be overridden.",
        location=location,
        severity=ERROR,
        code="E218",
        source="ESL typechecker",
    )

E219

E219(
    name: str, location: Location = get_location()
) -> EslDiagnostic

Unit '{name}' is not allowed here.

Source code in src/raesl/compile/diagnostics.py
def E219(name: str, location: Location = utils.get_location()) -> EslDiagnostic:
    """Unit '{name}' is not allowed here."""
    return EslDiagnostic(
        f"Unit '{name}' is not allowed here.",
        location=location,
        severity=ERROR,
        code="E219",
        source="ESL typechecker",
    )

E220

E220(
    name: str,
    kind: str,
    location: Location = get_location(),
) -> EslDiagnostic

Element '{name}' does not match with a {kind}.

Source code in src/raesl/compile/diagnostics.py
def E220(name: str, kind: str, location: Location = utils.get_location()) -> EslDiagnostic:
    """Element '{name}' does not match with a {kind}."""
    return EslDiagnostic(
        f"Element '{name}' does not match with a {kind}.",
        location=location,
        severity=ERROR,
        code="E220",
        source="ESL typechecker",
    )

E221

E221(
    kind: str,
    num: int,
    expected: int,
    location: Location = get_location(),
    references: Optional[List[Location]] = None,
) -> EslDiagnostic

Number of {kind}s does not match. Found {num}, expected {expected}.

Source code in src/raesl/compile/diagnostics.py
def E221(
    kind: str,
    num: int,
    expected: int,
    location: Location = utils.get_location(),
    references: Optional[List[Location]] = None,
) -> EslDiagnostic:
    """Number of {kind}s does not match. Found {num}, expected {expected}."""
    references = [] if references is None else references
    return EslDiagnostic(
        f"Number of {kind}s does not match. Found {num}, expected {expected}.",
        location=location,
        severity=ERROR,
        code="E221",
        source="ESL typechecker",
        related_information=[
            EslRelated(ref, f"Reference with {expected} {kind}(s).") for ref in references
        ],
    )

E222

E222(
    name: str,
    other: str,
    location: Location = get_location(),
    other_loc: Location = get_location(),
) -> EslDiagnostic

Value '{name}' has additional value restrictions relative to '{other}'.

Source code in src/raesl/compile/diagnostics.py
def E222(
    name: str,
    other: str,
    location: Location = utils.get_location(),
    other_loc: Location = utils.get_location(),
) -> EslDiagnostic:
    """Value '{name}' has additional value restrictions relative to '{other}'."""
    return EslDiagnostic(
        f"Value '{name}' has additional value restrictions relative to '{other}'.",
        location=location,
        severity=ERROR,
        code="E222",
        source="ESL typechecker",
        related_information=[EslRelated(other_loc, f"Other value '{other}'.")],
    )

E223

E223(
    name: str,
    other: str,
    kind: str,
    location: Location = get_location(),
    other_loc: Optional[Location] = None,
) -> EslDiagnostic

'{name}' is not a {kind} of {other}.

Source code in src/raesl/compile/diagnostics.py
def E223(
    name: str,
    other: str,
    kind: str,
    location: Location = utils.get_location(),
    other_loc: Optional[Location] = None,
) -> EslDiagnostic:
    """'{name}' is not a {kind} of {other}."""
    related = [] if other_loc is None else [EslRelated(other_loc, f"{other}")]
    return EslDiagnostic(
        f"'{name}' is not a {kind} of {other}.",
        location=location,
        severity=ERROR,
        code="E223",
        source="ESL typechecker",
        related_information=related,
    )

E224

E224(
    kind: str,
    unsupported: str,
    location: Location = get_location(),
) -> EslDiagnostic

{kind.capitalize()}s do not support {unsupported}.

Source code in src/raesl/compile/diagnostics.py
def E224(
    kind: str,
    unsupported: str,
    location: Location = utils.get_location(),
) -> EslDiagnostic:
    """{kind.capitalize()}s do not support {unsupported}."""
    return EslDiagnostic(
        f"{kind.capitalize()}s do not support {unsupported}.",
        location=location,
        severity=ERROR,
        code="E224",
        source="ESL typechecker",
    )

E225

E225(
    part: str,
    first_part: str,
    kind: str,
    location: Location = get_location(),
) -> EslDiagnostic

Cannot resolve '.{part}' part of the '{first_part}' {kind}.

Source code in src/raesl/compile/diagnostics.py
def E225(
    part: str,
    first_part: str,
    kind: str,
    location: Location = utils.get_location(),
) -> EslDiagnostic:
    """Cannot resolve '.{part}' part of the '{first_part}' {kind}."""
    return EslDiagnostic(
        f"Cannot resolve '.{part}' part of the '{first_part}' {kind}.",
        location=location,
        severity=ERROR,
        code="E225",
        source="ESL typechecker",
    )

E226

E226(
    name: str, location: Location = get_location()
) -> EslDiagnostic

Need '{name}' is not allowed to reference a bundle.

Source code in src/raesl/compile/diagnostics.py
def E226(
    name: str,
    location: Location = utils.get_location(),
) -> EslDiagnostic:
    """Need '{name}' is not allowed to reference a bundle."""
    return EslDiagnostic(
        f"Need '{name}' is not allowed to reference a bundle.",
        location=location,
        severity=ERROR,
        code="E226",
        source="ESL typechecker",
        related_information=[EslRelated(location, "Only elementary variables, try its fields.")],
    )

E227

E227(
    name: str,
    scope: str,
    location: Location = get_location(),
    dupes: Optional[List[Location]] = None,
) -> EslDiagnostic

Duplicate identifier '{name}' within the scope of '{scope}'.

Source code in src/raesl/compile/diagnostics.py
def E227(
    name: str,
    scope: str,
    location: Location = utils.get_location(),
    dupes: Optional[List[Location]] = None,
) -> EslDiagnostic:
    """Duplicate identifier '{name}' within the scope of '{scope}'."""
    dupes = [] if dupes is None else dupes
    return EslDiagnostic(
        f"Duplicate identifier '{name}' within the scope of '{scope}'.",
        location=location,
        severity=ERROR,
        code="E227",
        source="ESL typechecker",
        related_information=[EslRelated(dupe, f"Duplicate identifier '{name}'.") for dupe in dupes],
    )

E228

E228(
    lhs: Location,
    rhs: Location,
    reason: str = "are not compatible",
) -> EslDiagnostic

Values cannot be compared, design rule {reason}.

Source code in src/raesl/compile/diagnostics.py
def E228(lhs: Location, rhs: Location, reason: str = "are not compatible") -> EslDiagnostic:
    """Values cannot be compared, design rule {reason}."""
    return EslDiagnostic(
        f"Values cannot be compared, design rule {reason}.",
        location=lhs,
        severity=ERROR,
        code="E228",
        source="ESL typechecker",
    )

E400

E400(
    name: str,
    location: Location = get_location(),
    owners: Optional[List[Location]] = None,
) -> EslDiagnostic

Elementary variable value '{name}' has more than one property owner.

Source code in src/raesl/compile/diagnostics.py
def E400(
    name: str,
    location: Location = utils.get_location(),
    owners: Optional[List[Location]] = None,
) -> EslDiagnostic:
    """Elementary variable value '{name}' has more than one property owner."""
    owners = [] if owners is None else owners
    return EslDiagnostic(
        f"Elementary variable value '{name}' has more than one property owner.",
        location=location,
        severity=ERROR,
        code="E400",
        source="ESL instantiating",
        related_information=[EslRelated(owner, "Duplicate owner.") for owner in owners],
    )

W200

W200(
    name: str,
    kind: str,
    location: Location = get_location(),
    dupes: Optional[List[Location]] = None,
) -> EslDiagnostic

{kind.capitalize()} '{name}' has been specified multiple times.

Source code in src/raesl/compile/diagnostics.py
def W200(
    name: str,
    kind: str,
    location: Location = utils.get_location(),
    dupes: Optional[List[Location]] = None,
) -> EslDiagnostic:
    """{kind.capitalize()} '{name}' has been specified multiple times."""
    dupes = [] if dupes is None else dupes
    return EslDiagnostic(
        f"{kind.capitalize()} '{name}' has been specified multiple times.",
        location=location,
        severity=WARN,
        code="W200",
        source="ESL typechecker",
        related_information=[EslRelated(dupe, f"Duplicate {kind}.") for dupe in dupes],
    )

W300

W300(
    element: Optional[str] = None,
    location: Location = get_location(),
    comments: Optional[List[Location]] = None,
) -> EslDiagnostic

Documentation comment(s) could not be assigned to '{element}'.

Source code in src/raesl/compile/diagnostics.py
def W300(
    element: Optional[str] = None,
    location: Location = utils.get_location(),
    comments: Optional[List[Location]] = None,
) -> EslDiagnostic:
    """Documentation comment(s) could not be assigned to '{element}'."""
    element = "a specification element" if element is None else f"'{element}'"
    comments = [] if comments is None else comments
    return EslDiagnostic(
        f"Documentation comment(s) could not be assigned to {element}.",
        location=location,
        severity=WARN,
        code="W300",
        source="ESL AST builder",
        related_information=[
            EslRelated(doc, "Unassigned documentation comment.") for doc in comments
        ],
    )

esl_lines

State machines to recognize lines of ESL.

get_all_line_machines

get_all_line_machines() -> List[StateMachine]

Get all line state machines (for testing).

Source code in src/raesl/compile/esl_lines.py
def get_all_line_machines() -> List[StateMachine]:
    """Get all line state machines (for testing)."""
    machines: List[StateMachine] = list(machine_files.collect_line_machines().values())
    return machines

get_line_machine

get_line_machine(name: str) -> ProcessingStateMachine

Retrieve a line matcher state machine by name.

Source code in src/raesl/compile/esl_lines.py
def get_line_machine(name: str) -> machine_files.builder.ProcessingStateMachine:
    """Retrieve a line matcher state machine by name."""
    machine = machine_files.collect_line_machines()[name]
    return machine

get_line_machine_names

get_line_machine_names() -> Set[str]

Get the name of all line state machines (for testing).

Source code in src/raesl/compile/esl_lines.py
def get_line_machine_names() -> Set[str]:
    """Get the name of all line state machines (for testing)."""
    names = set(machine_files.collect_line_machines().keys())
    return names

instantiating

Instantiating module to build the output graph.

edge_building

Edge building for the instantiated output graph.

Reference: https://ratio-case.gitlab.io/docs/reference/esl_reference/dependency-derivations.html

EdgeFactory

EdgeFactory(
    diag_store: DiagnosticStore,
    node_store: Optional[NodeStore] = None,
)
Source code in src/raesl/compile/instantiating/edge_building.py
def __init__(
    self,
    diag_store: diagnostics.DiagnosticStore,
    node_store: Optional[NodeStore] = None,
):
    self.diag_store = diag_store
    self.node_store = node_store

    # Set categories dynamically.
    EdgeStore.categories = ["edges"] + [i[6:] for i in dir(self) if i.startswith("_make_")]
    self.edge_store = EdgeStore()
make_edges
make_edges(
    node_store: Optional[NodeStore] = None,
) -> List[Edge]

Derive edges from a :obj:NodeStore object.

Source code in src/raesl/compile/instantiating/edge_building.py
def make_edges(self, node_store: Optional[NodeStore] = None) -> List[Edge]:
    """Derive edges from a :obj:`NodeStore` object."""
    # Clear edge lists
    self.edge_store.clear()

    # Use the provided node_store or keep the current one.
    self.node_store = self.node_store if node_store is None else node_store
    if self.node_store is None:
        return list()

    for cat in self.edge_store.categories:
        # Make edges if the list is empty. This guarantees each category is
        # only generated once, regardless of them calling each other internally.
        if cat == "edges":
            continue
        self._make(cat)

    return self.edge_store.edges

EdgeStore

EdgeStore()

Edge storage with multiple categories for quicker access to specific subsets.

Source code in src/raesl/compile/instantiating/edge_building.py
def __init__(self):
    for cat in self.categories:
        setattr(self, cat, list())
add
add(edge: Edge, *args)

Add an edge to :obj:self.edges and any other specified category lists in args.

Source code in src/raesl/compile/instantiating/edge_building.py
def add(self, edge: Edge, *args):
    """Add an edge to :obj:`self.edges` and any other specified category lists in args."""
    self.edges.append(edge)
    for m in args:
        getattr(self, m).append(edge)
    logger.debug(f"Added ragraph.edge.Edge {edge} to edges and {args}.")
clear
clear()

Clear all edge categories.

Source code in src/raesl/compile/instantiating/edge_building.py
def clear(self):
    """Clear all edge categories."""
    for cat in self.categories:
        getattr(self, cat).clear()
consume
consume(edges: Iterable[Edge], *args)

Add any edges from an iterable to given categories.

Source code in src/raesl/compile/instantiating/edge_building.py
def consume(self, edges: Iterable[Edge], *args):
    """Add any edges from an iterable to given categories."""
    for e in edges:
        self.add(e, *args)

graph_building

Functions for instantiating the component tree.

GraphFactory

GraphFactory(
    diag_store: DiagnosticStore,
    spec: Optional[Specification] = None,
)

Graph factory class.

Converts a specification into a graph containing a node hierarchy and edges for derived dependencies between nodes.

Parameters:

Name Type Description Default
diag_store DiagnosticStore

Storage for found diagnostics during the process.

required

Attributes:

Name Type Description
node_factory

Factory that parses spec into Node objects.

edge_factory

Factory that derives edges from found Node objects.

Source code in src/raesl/compile/instantiating/graph_building.py
def __init__(
    self,
    diag_store: diagnostics.DiagnosticStore,
    spec: Optional[Specification] = None,
):
    self.diag_store = diag_store
    self.spec = spec
    self.node_factory = NodeFactory(self.diag_store, spec=spec)
    self.edge_factory = EdgeFactory(self.diag_store, node_store=self.node_factory.node_store)
make_graph
make_graph(
    spec: Optional[Specification] = None,
) -> Optional[Graph]

Instantiate the tree defined in the specification, and build a graph for it.

Parameters:

Name Type Description Default
spec Optional[Specification]

Specification object holding parsed ESL data.

None

Returns:

Type Description
Optional[Graph]

None if no root is available, else the constructed graph.

Note

Problems may be reported during instantiation and added to self.diag_store.

Source code in src/raesl/compile/instantiating/graph_building.py
def make_graph(self, spec: Optional[Specification] = None) -> Optional[Graph]:
    """Instantiate the tree defined in the specification, and build a graph for it.

    Arguments:
        spec: Specification object holding parsed ESL data.

    Returns:
        None if no root is available, else the constructed graph.

    Note:
        Problems may be reported during instantiation and added to self.diag_store.
    """
    self.spec = self.spec if spec is None else spec
    if self.spec is None or self.spec.world is None:
        return None

    node_dict = self.node_factory.make_nodes(self.spec)
    edges = self.edge_factory.make_edges()

    return Graph(nodes=node_dict.values(), edges=edges)
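The factory composes the two phases described in this section: nodes are built first, then edges are derived from the populated node store. A self-contained sketch of that pipeline (all classes here are simplified stand-ins, not the raesl or ragraph API):

```python
from typing import Dict, List, Optional


class Node:
    def __init__(self, name: str):
        self.name = name


class Edge:
    def __init__(self, source: str, target: str):
        self.source, self.target = source, target


class NodeFactory:
    """Phase 1: turn a parsed spec (here: a list of names) into Node objects."""

    def __init__(self):
        self.node_store: Dict[str, Node] = {}

    def make_nodes(self, spec: List[str]) -> Dict[str, Node]:
        self.node_store = {name: Node(name) for name in spec}
        return self.node_store


class EdgeFactory:
    """Phase 2: derive edges from the populated node store."""

    def __init__(self, node_store: Dict[str, Node]):
        self.node_store = node_store

    def make_edges(self) -> List[Edge]:
        # Toy derivation: chain the nodes in insertion order.
        names = list(self.node_store)
        return [Edge(a, b) for a, b in zip(names, names[1:])]


def make_graph(spec: Optional[List[str]]) -> Optional[dict]:
    if spec is None:
        return None  # Mirrors the "no root available" early exit.
    node_factory = NodeFactory()
    nodes = node_factory.make_nodes(spec)
    edges = EdgeFactory(node_factory.node_store).make_edges()
    return {"nodes": list(nodes.values()), "edges": edges}


graph = make_graph(["world", "world.pump", "world.drive"])
```

Note the ordering constraint this encodes: the `EdgeFactory` can only run after the `NodeFactory` has filled the shared node store.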

graph_data

Classes for the instantiated component graph.

InstNode

InstNode(name: str, variable: VarParam)

Instance node that joins one or more elementary nodes connected through parameters.

Parameters:

Name Type Description Default
name str

Dotted variable name of the associated elementary type.

required
variable VarParam

Variable that created the node.

required

Attributes:

Name Type Description
number

Unique number for each instance node, mostly useful for debugging.

params List[VarParam]

Parameters connected to the variable through this node.

comments List[str]

Comments from the connected variable and parameters.

Source code in src/raesl/compile/instantiating/graph_data.py
def __init__(self, name: str, variable: "VarParam"):
    self.number = InstNode.next_num
    InstNode.next_num = self.number + 1

    self.name = name
    self.variable = variable
    self.params: List["VarParam"] = []
    self.comments: List[str] = []

node_building

Methods for casting ESL AST into ragraph.node.Node objects.

NodeFactory

NodeFactory(
    diag_store: DiagnosticStore,
    spec: Optional[Specification] = None,
)

Node factory. Creates :obj:Node objects from a :obj:Specification.

Source code in src/raesl/compile/instantiating/node_building.py
def __init__(
    self,
    diag_store: diagnostics.DiagnosticStore,
    spec: Optional[Specification] = None,
):
    self.diag_store = diag_store
    self.spec = spec
    self.node_store = NodeStore()
    self.type_inst_map = {}
make_nodes
make_nodes(
    spec: Optional[Specification] = None,
) -> Dict[str, Node]

Instantiate AST and create :obj:Node objects accordingly.

Source code in src/raesl/compile/instantiating/node_building.py
def make_nodes(self, spec: Optional[Specification] = None) -> Dict[str, Node]:
    """Instantiate AST and create :obj:`Node` objects accordingly."""
    # Clear maps.
    self.node_store.clear()

    # Set provided spec or try and keep the current one.
    self.spec = self.spec if spec is None else spec
    if self.spec is None:
        return dict()

    # Create types first.
    for t in self.spec.types.values():
        self.type_inst_map[t.type] = t
        tn = make_type_node(t)
        if tn is not None:
            self._add(tn, "types")

    # Create world node and instantiate it (recursively).
    self._add(
        Node(
            name="world",
            kind="component",
            annotations=dict(
                esl_info=dict(definition_name="world", property_variables=[], comments=[])
            ),
        ),
        "components",
    )
    self._instantiate_component(self.spec.world, {}, "world")

    post_process_comments(self.node_store.nodes.values())

    return self.node_store.nodes

NodeStore

NodeStore()

Node storage with multiple categories for quicker access to specific subsets.

Source code in src/raesl/compile/instantiating/node_building.py
def __init__(self):
    for cat in self.categories:
        setattr(self, cat, dict())
add
add(node: Node, *args)

Add node to :obj:self.nodes and any other specified categories in args.

Source code in src/raesl/compile/instantiating/node_building.py
def add(self, node: Node, *args):
    """Add node to :obj:`self.nodes` and any other specified categories in args."""
    name = node.name
    self.nodes[name] = node
    for m in args:
        getattr(self, m)[name] = node
    logger.debug(f"Added ragraph.node.Node '{node.name}' to nodes and {args}.")
clear
clear()

Clear all node categories.

Source code in src/raesl/compile/instantiating/node_building.py
def clear(self):
    """Clear all node categories."""
    for cat in self.categories:
        getattr(self, cat).clear()

make_behavior_node

make_behavior_node(
    b: BehaviorFunction,
    inst_name: str,
    inst_map: Dict[ElementaryVarNode, InstNode],
) -> Node

Behavior spec node creation.

Parameters:

Name Type Description Default
b BehaviorFunction

The behavior specification for which a node must be created.

required
inst_name str

The instantiation name of the component.

required
inst_map Dict[ElementaryVarNode, InstNode]

Dictionary containing the instantiated variables.

required

Returns:

Type Description
Node

Node of kind "behavior_spec".

Source code in src/raesl/compile/instantiating/node_building.py
def make_behavior_node(
    b: BehaviorFunction, inst_name: str, inst_map: Dict[ElementaryVarNode, InstNode]
) -> Node:
    """Behavior spec node creation.

    Arguments:
      b: The behavior specification for which a node must be created.
      inst_name: The instantiation name of the component.
      inst_map: Dictionary containing the instantiated variables.

    Returns:
      Node of kind "behavior_spec".
    """
    n = inst_name + "." + b.name_tok.tok_text
    a = _make_behavior_annotations(b=b, inst_map=inst_map)
    return Node(name=n, kind="behavior_spec", annotations=a)

make_component_node

make_component_node(
    c: ComponentInstance,
    inst_name: str,
    params: Dict[ElementaryVarNode, InstNode],
) -> Node

Node creation for a component.

Parameters:

Name Type Description Default
c ComponentInstance

The component for which a node is created.

required
inst_name str

The instantiation name of the node.

required
params Dict[ElementaryVarNode, InstNode]

List of parameters of the component.

required

Returns:

Type Description
Node

Node of kind "component"

Source code in src/raesl/compile/instantiating/node_building.py
def make_component_node(
    c: ComponentInstance, inst_name: str, params: Dict[ElementaryVarNode, InstNode]
) -> Node:
    """Node creation for a component.

    Arguments:
        c: The component for which a node is created.
        inst_name: The instantiation name of the node.
        params: List of parameters of the component.

    Returns:
        Node of kind "component"
    """
    a = _make_component_annotations(c=c, params=params)
    return Node(name=inst_name, kind="component", annotations=a)

make_design_node

make_design_node(
    d: Design,
    inst_name: str,
    inst_map: Dict[ElementaryVarNode, InstNode],
) -> Node

Design spec node creation.

Parameters:

Name Type Description Default
d Design

The design specification for which a node must be created.

required
inst_name str

The instantiation name of the component.

required
inst_map Dict[ElementaryVarNode, InstNode]

Dictionary containing the instantiated variables.

required

Returns:

Type Description
Node

Node of kind "design_spec".

Source code in src/raesl/compile/instantiating/node_building.py
def make_design_node(
    d: Design, inst_name: str, inst_map: Dict[ElementaryVarNode, InstNode]
) -> Node:
    """Design spec node creation.

    Arguments:
      d: The design specification for which a node must be created.
      inst_name: The instantiation name of the component.
      inst_map: Dictionary containing the instantiated variables.

    Returns:
      Node of kind "design_spec".
    """
    n = inst_name + "." + d.label_tok.tok_text
    a = _make_design_annotations(d=d, inst_map=inst_map)
    return Node(name=n, kind="design_spec", annotations=a)

make_goal_node

make_goal_node(
    g: Goal,
    inst_name: str,
    inst_map: Dict[ElementaryVarNode, InstNode],
) -> Node

Goal node creation.

Parameters:

Name Type Description Default
g Goal

The goal for which a node must be created.

required
inst_name str

The instantiation name of the component.

required
inst_map Dict[ElementaryVarNode, InstNode]

Dictionary containing the instantiated variables.

required

Returns:

Type Description
Node

Node of kind "function".

Source code in src/raesl/compile/instantiating/node_building.py
def make_goal_node(g: Goal, inst_name: str, inst_map: Dict[ElementaryVarNode, InstNode]) -> Node:
    """Goal node creation.

    Arguments:
      g: The goal for which a node must be created.
      inst_name: The instantiation name of the component.
      inst_map: Dictionary containing the instantiated variables.

    Returns:
      Node of kind "function".
    """
    name = inst_name + "." + g.label_tok.tok_text
    a = _make_goal_annotations(g=g, inst_name=inst_name, inst_map=inst_map)
    return Node(name=name, kind="function_spec", annotations=a)

make_need_node

make_need_node(n: Need, inst_name: str) -> Node

Node creation for a need.

Parameters:

Name Type Description Default
n Need

The need for which a node must be created.

required
inst_name str

The instantiation name of the need.

required

Returns:

Type Description
Node

Node of kind "need"

Source code in src/raesl/compile/instantiating/node_building.py
def make_need_node(n: Need, inst_name: str) -> Node:
    """Node creation for a need.

    Arguments:
      n: The need for which a node must be created.
      inst_name: The instantiation name of the need.

    Returns:
      Node of kind "need"
    """
    name = f"{inst_name}.{n.label_tok.tok_text}"
    a = _make_need_annotations(n=n, inst_prefix=inst_name)
    return Node(name=name, kind="need", annotations=a)

make_parameter_instmap

make_parameter_instmap(
    param: VarParam,
    param_node: VarNode,
    arg_node: VarNode,
    parent_inst_map: Dict[ElementaryVarNode, InstNode],
) -> Dict[ElementaryVarNode, InstNode]

Construct an inst node map for a parameter of a child component instance.

Parameters:

Name Type Description Default
param VarParam

Parameter definition in the child component definition.

required
param_node VarNode

VarNode within the parameter in the child component definition. Note these are like variables in the child.

required
arg_node VarNode

Node in the parent that must match with param_node. These nodes exist in the parent component, and may contain part of a variable group.

required
parent_inst_map Dict[ElementaryVarNode, InstNode]

Variable instance map of the parent. As component instantiation is recursive, this may also include nodes from the grand-parent or higher. Should not be modified.

required

Returns:

Type Description
Dict[ElementaryVarNode, InstNode]

Instance node map for a parameter.

Source code in src/raesl/compile/instantiating/node_building.py
def make_parameter_instmap(
    param: VarParam,
    param_node: VarNode,
    arg_node: VarNode,
    parent_inst_map: Dict[ElementaryVarNode, InstNode],
) -> Dict[ElementaryVarNode, InstNode]:
    """Construct an inst node map for a parameter of a child component instance.

    Arguments:
        param: Parameter definition in the child component definition.
        param_node: VarNode within the parameter in the child component
            definition. Note these are like variables in the child.
        arg_node: Node in the parent that must match with param_node.
            These nodes exist in the parent component, and may contain part of a
            variable group.
        parent_inst_map: Variable instance map of the parent. As component
            instantiation is recursive, this may also include nodes from the
            grand-parent or higher. Should not be modified.

    Returns:
        Instance node map for a parameter.
    """
    if isinstance(param_node, ElementaryVarNode):
        # Elementary parameter node. At this point, the arg_node should also be
    # elementary, and exist as an instance node representing part of a variable
    # in the parent.
        # Link that instance node to the param node in the child.
        assert isinstance(arg_node, ElementaryVarNode)

        instnode = parent_inst_map.get(arg_node)
        assert instnode is not None

        instnode.add_param(param)

        node = param.resolve_element(param.name_tok.tok_text)
        if isinstance(node, ElementaryVarNode):
            instnode.add_comment(node.get_comment())
        elif isinstance(node, CompoundVarNode):
            child = node.resolve_node(instnode.name.split(".")[-1])
            if child:
                instnode.add_comment(child.get_comment())
        return {param_node: instnode}
    else:
        child_map = {}
        assert isinstance(param_node, CompoundVarNode)
        if isinstance(arg_node, CompoundVarNode):
            assert len(param_node.child_nodes) == len(arg_node.child_nodes)
            for param_child, arg_child in zip(param_node.child_nodes, arg_node.child_nodes):
                child_map.update(
                    make_parameter_instmap(param, param_child, arg_child, parent_inst_map)
                )
        else:
            assert isinstance(arg_node, GroupNode)
            assert len(param_node.child_nodes) == len(arg_node.child_nodes)
            for param_child, arg_child in zip(param_node.child_nodes, arg_node.child_nodes):
                child_map.update(
                    make_parameter_instmap(param, param_child, arg_child, parent_inst_map)
                )

        return child_map

make_relation_node

make_relation_node(
    r: RelationInstance,
    inst_name: str,
    inst_map: Dict[ElementaryVarNode, InstNode],
) -> Node

Relation spec node creation.

Parameters:

Name Type Description Default
r RelationInstance

The relation specification for which a node must be created.

required
inst_name str

The instantiation name of the component.

required
inst_map Dict[ElementaryVarNode, InstNode]

Dictionary containing the instantiated variables.

required

Returns:

Type Description
Node

Node of kind "relation_spec".

Source code in src/raesl/compile/instantiating/node_building.py
def make_relation_node(
    r: RelationInstance, inst_name: str, inst_map: Dict[ElementaryVarNode, InstNode]
) -> Node:
    """Relation spec node creation.

    Arguments:
      r: The relation specification for which a node must be created.
      inst_name: The instantiation name of the component.
      inst_map: Dictionary containing the instantiated variables.

    Returns:
      Node of kind "relation_spec".
    """
    n = inst_name + "." + r.inst_name_tok.tok_text
    a = _make_relation_annotations(r, inst_map)
    return Node(name=n, kind="relation_spec", annotations=a)

make_transform_node

make_transform_node(
    t: Transformation,
    inst_name: str,
    inst_map: Dict[ElementaryVarNode, InstNode],
) -> Node

Transformation node creation.

Parameters:

Name Type Description Default
t Transformation

The transformation for which a node must be created.

required
inst_name str

The instantiation name of the component.

required
inst_map Dict[ElementaryVarNode, InstNode]

Dictionary containing the instantiated variables.

required

Returns:

Type Description
Node

Node of kind "function_spec".

Source code in src/raesl/compile/instantiating/node_building.py
def make_transform_node(
    t: Transformation, inst_name: str, inst_map: Dict[ElementaryVarNode, InstNode]
) -> Node:
    """Transformation node creation.

    Arguments:
      t: The transformation for which a node must be created.
      inst_name: The instantiation name of the component.
      inst_map: Dictionary containing the instantiated variables.

    Returns:
      Node of kind "function_spec".
    """
    name = inst_name + "." + t.label_tok.tok_text
    a = _make_transform_annotations(t=t, inst_name=inst_name, inst_map=inst_map)
    return Node(name=name, kind="function_spec", annotations=a)

make_type_node

make_type_node(tdef: TypeDef) -> Node

Node creation for a type definition.

Parameters:

Name Type Description Default
tdef TypeDef

Type definition for which a node must be created.

required

Returns:

Type Description
Node

Node of kind variable_type

Source code in src/raesl/compile/instantiating/node_building.py
def make_type_node(tdef: types.TypeDef) -> Node:
    """Node creation for a type definition.

    Arguments:
      tdef: Type definition for which a node must be created.

    Returns:
      Node of kind `variable_type`
    """
    a = _make_type_annotations(tdef=tdef)
    if not a:
        return None
    return Node(name=tdef.name.tok_text, kind="variable_type", annotations=a)

make_variable_instmap

make_variable_instmap(
    var: VarParam, varname: List[str], node: VarNode
) -> Dict[ElementaryVarNode, InstNode]

Construct instance nodes for the provided variable.

Parameters:

Name Type Description Default
var VarParam

Variable represented by the node.

required
varname List[str]

Collected prefixes of the dotted name so far.

required
node VarNode

Node to associate with one or more inst nodes.

required

Returns:

Type Description
Dict[ElementaryVarNode, InstNode]

Map of the elementary variable nodes to their associated instance nodes.

Source code in src/raesl/compile/instantiating/node_building.py
def make_variable_instmap(
    var: VarParam, varname: List[str], node: VarNode
) -> Dict[ElementaryVarNode, InstNode]:
    """Construct instance nodes for the provided variable.

    Arguments:
        var: Variable represented by the node.
        varname: Collected prefixes of the dotted name so far.
        node: Node to associate with one or more inst nodes.

    Returns:
        Map of the elementary variable nodes to their associated instance nodes.
    """
    assert var.is_variable

    if isinstance(node, ElementaryVarNode):
        instnode = InstNode(".".join(varname), node)
        instnode.add_comment(node.get_comment())
        return {node: instnode}
    else:
        assert isinstance(node, CompoundVarNode)
        varname.append("")  # Will be overwritten in upcoming loop
        inst_map = {}
        for cn in node.child_nodes:
            varname[-1] = cn.name_tok.tok_text
            inst_map.update(make_variable_instmap(var, varname, cn))
        del varname[-1]
        return inst_map
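The recursion above flattens a possibly nested compound variable into one instance node per elementary leaf, keyed by its dotted name. A stand-alone sketch of that naming scheme (the node classes are simplified stand-ins for the raesl AST types):

```python
from typing import Dict, List, Union


class ElementaryVarNode:
    """Leaf variable node."""


class CompoundVarNode:
    """Compound variable node with named children."""

    def __init__(self, children: Dict[str, "VarNode"]):
        self.child_nodes = children


VarNode = Union[ElementaryVarNode, CompoundVarNode]


def flatten(varname: List[str], node: VarNode) -> Dict[str, ElementaryVarNode]:
    """Map dotted instance names to their elementary leaf nodes."""
    if isinstance(node, ElementaryVarNode):
        return {".".join(varname): node}
    inst_map: Dict[str, ElementaryVarNode] = {}
    for child_name, child in node.child_nodes.items():
        inst_map.update(flatten(varname + [child_name], child))
    return inst_map


# A variable "pump.flow" whose compound type has two elementary fields.
flow = CompoundVarNode({"rate": ElementaryVarNode(), "pressure": ElementaryVarNode()})
names = flatten(["pump", "flow"], flow)
# `names` has the keys "pump.flow.rate" and "pump.flow.pressure".
```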

make_variable_node

make_variable_node(
    v: InstNode,
    type_inst_map: Dict[ElementaryType, TypeDef],
) -> Node

Node creation for a variable.

Parameters:

Name Type Description Default
v InstNode

The instance node for which a variable node must be created.

required
type_inst_map Dict[ElementaryType, TypeDef]

Mapping from elementary types to their type definitions.

required

Returns:

Type Description
Node

Node of kind "variable".

Source code in src/raesl/compile/instantiating/node_building.py
def make_variable_node(v: InstNode, type_inst_map: Dict[ElementaryType, TypeDef]) -> Node:
    """Node creation for a variable.

    Arguments:
        v: The instance node for which a variable node must be created.
        type_inst_map: Mapping from elementary types to their type definitions.

    Returns:
        Node of kind "variable".
    """
    a = _make_variable_annotations(v=v, type_inst_map=type_inst_map)
    return Node(name=v.name, kind="variable", annotations=a)

post_process_comments

post_process_comments(nodes: List[Node]) -> None

Post-process comments attached to nodes.

Parameters:

Name Type Description Default
nodes List[Node]

List of nodes for which the comments must be post-processed.

required
Note

This is a simple implementation to process documentation tags as described in LEP0008.

Source code in src/raesl/compile/instantiating/node_building.py
def post_process_comments(nodes: List[Node]) -> None:
    """Post-process comments attached to nodes.

    Arguments:
        nodes: List of nodes for which the comments must be post-processed.

    Note:
        This is a simple implementation to process documentation tags as described
        in LEP0008.
    """
    for n in nodes:
        cms = n.annotations.esl_info.get("comments", [])
        plain_comments = []
        tagged_comments = defaultdict(list)

        for cm in cms:
            match = TAG_COMMENT_PAT.match(cm)
            if match is None:
                plain_comments.append(cm)
            else:
                groups = match.groupdict()
                tagged_comments[groups["tag"]].append(groups["comment"])

        n.annotations.esl_info["comments"] = plain_comments
        n.annotations.esl_info["tagged_comments"] = tagged_comments
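A minimal sketch of the tag-splitting step, assuming a tag pattern of the form `@<tag> <comment>` (the actual `TAG_COMMENT_PAT` lives in the raesl source and may differ):

```python
import re
from collections import defaultdict
from typing import Dict, List, Tuple

# Hypothetical stand-in for TAG_COMMENT_PAT: "@<tag> <comment>".
TAG_COMMENT_PAT = re.compile(r"@(?P<tag>\w+)\s+(?P<comment>.*)")


def split_comments(comments: List[str]) -> Tuple[List[str], Dict[str, List[str]]]:
    """Split comments into plain ones and a tag -> comments mapping."""
    plain: List[str] = []
    tagged: Dict[str, List[str]] = defaultdict(list)
    for cm in comments:
        match = TAG_COMMENT_PAT.match(cm)
        if match is None:
            plain.append(cm)  # Untagged comments pass through unchanged.
        else:
            groups = match.groupdict()
            tagged[groups["tag"]].append(groups["comment"])
    return plain, dict(tagged)


plain, tagged = split_comments(
    ["Regular documentation.", "@rationale chosen for cost", "@owner systems team"]
)
```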

machine_files

State machine files for recognizing lines and ESL.

collect_line_machines

collect_line_machines() -> (
    Dict[str, ProcessingStateMachine]
)

Construct and collect all available line machines and return them.

Source code in src/raesl/compile/machine_files/__init__.py
def collect_line_machines() -> Dict[str, builder.ProcessingStateMachine]:
    """Construct and collect all available line machines and return them."""
    global _LINE_MACHINES

    if _LINE_MACHINES is not None:
        return _LINE_MACHINES

    _LINE_MACHINES = {}
    mach_builder = builder.StateMachineBuilder()

    collections: List[List[Tuple[str, str, Optional[typing.ProcessingFunc]]]] = [
        behavior.MACHINES,
        comments.MACHINES,
        component_definitions.MACHINES,
        component_instances.MACHINES,
        designs.MACHINES,
        goals.MACHINES,
        groups.MACHINES,
        needs.MACHINES,
        parameters.MACHINES,
        relation_definitions.MACHINES,
        relation_instances.MACHINES,
        transforms.MACHINES,
        type_defs.MACHINES,
        variables.MACHINES,
        verb_defs.MACHINES,
    ]

    for mach_collection in collections:
        for name, spec_text, proc_func in mach_collection:
            mach = mach_builder.create(spec_text, proc_func)
            _LINE_MACHINES[name] = mach

    return _LINE_MACHINES

argument_list

Line matcher for argument lists.

Note that this file is imported by other machine files rather than providing argument list processing itself.

process_argument_list_line

process_argument_list_line(
    tags: Dict[str, List[Token]]
) -> List[Token]

Extract the argument names from the collected tags.

Source code in src/raesl/compile/machine_files/argument_list.py
def process_argument_list_line(tags: Dict[str, List["Token"]]) -> List["Token"]:
    """Extract the argument names from the collected tags."""
    return tags["argument"]

behavior

Line matchers for the behavior section.

builder

Code to construct line matching state machines.

The entry point to create a state machine is calling 'create' on a StateMachineBuilder instance.

A state machine has locations and edges.

One location in a machine may have an 'initial' option, denoting that it is the first location of the machine. Locations may also be accepting, denoting that they are proper end states. The accepting option takes a name, making it possible to distinguish which end location of the state machine was reached.

Edges start at a location and end at a location. Edges have a token-name, and may only be chosen when the input has a token with the same name. An edge may also have a tag option. The tag-name represents what kind of relevant token it is. When an edge is taken with a tag-name, the token matching with the edge is appended to a list associated with the tag-name. In this way, it is possible to extract relevant information from the matched sentence afterwards.

State machine execution assumes the machine is deterministic: from a location, each outgoing edge must have a different token-name. Note that the scanner may not map a given text to a unique token; this is resolved by sorting the edges from the most specific token-name to the least specific one.

In general, writing code to build locations and edges is bulky and hard to understand. Instead a text format is defined to allow writing the state machines as text, and let the computer build the state machine from the description. The text format accepted by the builder is:

state-machine ::=
  MACHINE-NAME ":" { loc-def | edge-def }+

loc-def ::=
  LOC-NAME [ "initial" ] [ "accept" "=" ACCEPT-name ] ";"

edge-def ::=
  LOC-NAME "->" LOC-NAME "[" TOKEN-NAME "]" [ "tag" "=" TAG-NAME ] ";"

Locations in 'loc-def' must not exist already. Missing locations in 'edge-def' are silently created. Token names are defined in parsing/scanner.py.
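The run-time semantics described above can be sketched as a small deterministic matcher (a simplified stand-in, not the raesl implementation): from the current location, follow the outgoing edge whose token-name equals the input token, collect tagged tokens, and report the accept name if the final location is accepting.

```python
from typing import Dict, List, Optional, Tuple


class Location:
    def __init__(self, name: str, accept: Optional[str] = None):
        self.name = name
        self.accept = accept
        # Edges as (token_name, destination, optional tag) triples.
        self.out_edges: List[Tuple[str, "Location", Optional[str]]] = []


def run(initial: Location, tokens: List[Tuple[str, str]]):
    """Match (token_name, text) pairs; return (accept_name, tags) or None."""
    loc = initial
    tags: Dict[str, List[str]] = {}
    for tok_name, text in tokens:
        for edge_tok, dest, tag in loc.out_edges:
            if edge_tok == tok_name:
                if tag is not None:
                    tags.setdefault(tag, []).append(text)
                loc = dest
                break
        else:
            return None  # No edge matches: the line is rejected.
    if loc.accept is None:
        return None  # Ended in a non-accepting location.
    return loc.accept, tags


# Machine equivalent to the description:
#   start initial;
#   end accept=done;
#   start -> end [DOTTEDNAME] tag=name;
start = Location("start")
end = Location("end", accept="done")
start.out_edges.append(("DOTTEDNAME", end, "name"))
```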

ProcessingStateMachine

ProcessingStateMachine(
    name: str,
    processing_func: Optional[ProcessingFunc] = None,
)

Bases: StateMachine

Extended StateMachine that also holds a callback function to add extracted parsing information into the ast.

Parameters:

Name Type Description Default
name str

Name of the state machine, also the name of the matched sequence.

required
processing_func Optional[ProcessingFunc]

If not None, function that inserts relevant information from the matched line into the ast that is constructed.

None
Source code in src/raesl/compile/machine_files/builder.py
def __init__(self, name: str, processing_func: Optional[typing.ProcessingFunc] = None):
    super().__init__(name)
    self.processing_func = processing_func

StateMachineBuilder

StateMachineBuilder()

Class for easily constructing a state machine from a textual description.

Attributes:

Name Type Description
lexer

Lexer to tokenize the input text.

parser

Parser to interpret the tokens.

machine

State machine to be built.

locs

Temporary location map filled while building the state machine.

Source code in src/raesl/compile/machine_files/builder.py
def __init__(self):
    self.lexer = StateMachineLexer()
    self.parser = StateMachineParser()
    self.machine = None
    self.locs = None
create
create(
    machine_text: str,
    processing_func: Optional[ProcessingFunc] = None,
) -> ProcessingStateMachine

Create a state machine by interpreting the provided state machine text.

Parameters:

Name Type Description Default
machine_text str

Description of the state machine as defined by StateMachineLexer and StateMachineParser.

required
processing_func Optional[ProcessingFunc]

Optional processing function to copy relevant information into the abstract syntax tree.

None

Returns:

Type Description
ProcessingStateMachine

The state machine that implements the description.

Source code in src/raesl/compile/machine_files/builder.py
def create(
    self, machine_text: str, processing_func: Optional[typing.ProcessingFunc] = None
) -> ProcessingStateMachine:
    """Create a state machine by interpreting the provided state machine text.

    Arguments:
        machine_text: Description of the state machine as defined by
            StateMachineLexer and StateMachineParser.
        processing_func: Optional processing function to copy relevant
            information into the abstract syntax tree.

    Returns:
        The state machine that implements the description.
    """
    try:
        parsed = self.parser.parse(self.lexer.tokenize(machine_text))
    except sly.lex.LexError as ex:
        logger.error(f"LEX ERROR: {ex}")
        parsed = None

    # DEBUG: Dump input text with line numbers if parsing of the input text fails.
    if not parsed:
        for i, line in enumerate(machine_text.split("\n")):
            logger.debug(f"{i + 1:3d}: {line}")

    assert parsed

    mname, mlines = parsed
    self.machine = ProcessingStateMachine(mname, processing_func)
    self.locs = {}

    for mline in mlines:
        if mline[0] == "loc":
            loc_name = mline[1]
            loc_opts = mline[2]
            self.create_loc(loc_name, loc_opts)

        elif mline[0] == "edge":
            edge_source = mline[1]
            edge_dest = mline[2]
            edge_toktype = mline[3]
            edge_opts = mline[4]

            source = self.ensure_loc(edge_source)
            dest = self.ensure_loc(edge_dest)
            tag = None
            for opt in edge_opts:
                assert opt[0] == "tag"
                tag = opt[1]
                break

            edge = Edge(dest, edge_toktype, tag)
            source.out_edges.append(edge)

        else:
            assert False, "Unexpected machine line " + repr(mline)

    # Some sanity checks.
    assert self.machine.initial_loc  # There should be an initial location.

    self.machine.sort_edges()
    machine = self.machine
    self.machine = None
    self.locs = None
    return machine
create_loc
create_loc(
    name: str,
    opts: List[Union[Tuple[str], Tuple[str, str]]],
) -> Location

Create a new location.

Parameters:

Name Type Description Default
name str

Name of the location to create.

required
opts List[Union[Tuple[str], Tuple[str, str]]]

Location options, a list of ('initial',) and/or ('accept', str).

required

Returns:

Type Description
Location

The created location.

Source code in src/raesl/compile/machine_files/builder.py
def create_loc(self, name: str, opts: List[Union[Tuple[str], Tuple[str, str]]]) -> Location:
    """Create a new location.

    Arguments:
        name: Name of the location to create.
        opts: Location options, a list of ('initial',) and/or ('accept', str).

    Returns:
        The created location.
    """
    initial = False
    accept = None
    for opt in opts:
        if opt[0] == "initial":
            initial = True
        elif opt[0] == "accept":
            accept = opt[1]
        else:
            assert False, "Unexpected loc option " + repr(opt)

    loc = Location(name, accept)
    assert name not in self.locs
    self.locs[name] = loc
    if initial:
        self.machine.initial_loc = loc
    return loc
ensure_loc
ensure_loc(name: str) -> Location

Make sure a location with the provided name exists. If it doesn't, create it.

Parameters:

Name Type Description Default
name str

Name of the location to ensure.

required

Returns:

Type Description
Location

The location with the provided name.

Source code in src/raesl/compile/machine_files/builder.py
def ensure_loc(self, name: str) -> Location:
    """Make sure a location with the provided name exists. If it doesn't, create it.

    Arguments:
        name: Name of the location to ensure.

    Returns:
        The location with the provided name.
    """
    loc = self.locs.get(name)
    if loc is not None:
        return loc

    return self.create_loc(name, [])
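The bookkeeping of create_loc and ensure_loc can be illustrated with minimal stand-ins for the builder and Location classes (the names Builder, start and end below are hypothetical, chosen only for this sketch; the real classes live in src/raesl/compile/machine_files/builder.py):

```python
class Location:
    def __init__(self, name, accept=None):
        self.name = name
        self.accept = accept

class Builder:
    """Minimal stand-in mimicking create_loc / ensure_loc from the listings above."""

    def __init__(self):
        self.locs = {}
        self.initial_loc = None

    def create_loc(self, name, opts):
        initial, accept = False, None
        for opt in opts:
            if opt[0] == "initial":
                initial = True
            elif opt[0] == "accept":
                accept = opt[1]
        loc = Location(name, accept)
        assert name not in self.locs  # Names must be unique.
        self.locs[name] = loc
        if initial:
            self.initial_loc = loc
        return loc

    def ensure_loc(self, name):
        loc = self.locs.get(name)
        return loc if loc is not None else self.create_loc(name, [])

b = Builder()
start = b.create_loc("start", [("initial",)])  # Becomes the initial location.
end = b.ensure_loc("end")                      # Created on demand.
same = b.ensure_loc("end")                     # Second lookup returns the same object.
```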

StateMachineLexer

Bases: Lexer

Lexer for tokenizing state machine descriptions.

Note that SLY and pylint are known not to like each other.

comments

Line matcher state machines for comments.

component_definitions

Line matcher state machines for component definitions.

component_instances

Line matchers for component instances within world or a component definition.

designs

Line matcher state machines for designs.

goals

Line matcher state machines for goals.

groups

Line matching state machines for groups.

machine_parts

Library with common parts of line matchers.

get_argument_references_part

get_argument_references_part(
    start_loc: str,
    end_1locs: List[str],
    tagname: str,
    prefix: str = "",
) -> str

State machine part implementing the 'argument-references' rule.

argument-references ::= argument-name { and-connector argument-name }

and-connector ::= "and" | "," | "," "and"

Source code in src/raesl/compile/machine_files/machine_parts.py
def get_argument_references_part(
    start_loc: str, end_1locs: List[str], tagname: str, prefix: str = ""
) -> str:
    """State machine part implementing the 'argument-references' rule.

    argument-references ::=
        argument-name { and-connector argument-name } \n

    and-connector ::=
        "and" | "," | "," "and"
    """
    assert len(end_1locs) == 1

    locs = {"start": start_loc, "end": end_1locs[0], "tagname": tagname}
    locs.update(make_loc_names(prefix, "argrefs", 1))

    text = """\
        {argrefs1};

        {start}    -> {end}      [DOTTEDNAME] tag={tagname};
        {end}      -> {start}    [AND_KW];
        {end}      -> {argrefs1} [COMMA_TK];
        {argrefs1} -> {end}      [DOTTEDNAME] tag={tagname};
        {argrefs1} -> {start}    [AND_KW];
    """

    return text.format(**locs)

get_auxiliary_verb_part

get_auxiliary_verb_part(
    start_loc: str, end_1locs: List[str], tagname: str
) -> str

Part implementing the 'auxiliary-verb' rule.

auxiliary-verb ::= "shall" | "must" | "should" | "could" | "won't"

Source code in src/raesl/compile/machine_files/machine_parts.py
def get_auxiliary_verb_part(start_loc: str, end_1locs: List[str], tagname: str) -> str:
    """Part implementing the 'auxiliary-verb' rule.

    auxiliary-verb ::=
        "shall" | "must" | "should" | "could" | "won't"
    """
    assert len(end_1locs) == 1

    text = """\
        {start} -> {end} [SHALL_KW] tag={tagname};
        {start} -> {end} [MUST_KW] tag={tagname};
        {start} -> {end} [SHOULD_KW] tag={tagname};
        {start} -> {end} [COULD_KW] tag={tagname};
        {start} -> {end} [WONT_KW] tag={tagname};
    """
    return text.format(start=start_loc, end=end_1locs[0], tagname=tagname)
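Since the part is a plain text template, its expansion is easy to inspect. The sketch below copies the function from the listing above and expands it with example location names (s0, s1 and the tag name aux are arbitrary choices for this illustration):

```python
# Standalone copy of get_auxiliary_verb_part from the listing above.
def get_auxiliary_verb_part(start_loc, end_1locs, tagname):
    assert len(end_1locs) == 1
    text = """\
        {start} -> {end} [SHALL_KW] tag={tagname};
        {start} -> {end} [MUST_KW] tag={tagname};
        {start} -> {end} [SHOULD_KW] tag={tagname};
        {start} -> {end} [COULD_KW] tag={tagname};
        {start} -> {end} [WONT_KW] tag={tagname};
    """
    return text.format(start=start_loc, end=end_1locs[0], tagname=tagname)

part = get_auxiliary_verb_part("s0", ["s1"], "aux")
# Each line is one edge, e.g. 's0 -> s1 [SHALL_KW] tag=aux;'.
```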

get_compare_op_part

get_compare_op_part(
    start_loc: str,
    end_1locs: List[str],
    tagname: str,
    prefix: str = "",
) -> str

State machine part implementing the 'compare-op' rule.

compare-op ::= "smaller" "than" | "greater" "than" | "not" "equal" "to" | "equal" "to" | "at" "least" | "at" "most" | "approximately"

Source code in src/raesl/compile/machine_files/machine_parts.py
def get_compare_op_part(
    start_loc: str, end_1locs: List[str], tagname: str, prefix: str = ""
) -> str:
    """State machine part implementing the 'compare-op' rule.

    compare-op ::=
        "smaller" "than" | "greater" "than" | "not" "equal" "to" |
        "equal" "to" | "at" "least" | "at" "most" | "approximately"
    """
    assert len(end_1locs) == 1

    locs = {"start": start_loc, "end": end_1locs[0], "tagname": tagname}
    locs.update(make_loc_names(prefix, "cmp", 5))

    text = """\
        cmp1; cmp2; cmp3; cmp4; cmp5;

        {start} -> cmp1 [SMALLER_KW] tag={tagname};
        {start} -> cmp1 [GREATER_KW] tag={tagname};
            cmp1 -> {end} [THAN_KW];
        {start} -> cmp2 [EQUAL_KW] tag={tagname};
            cmp2 -> {end} [TO_KW];
        {start} -> cmp3 [NOT_KW] tag={tagname};
            cmp3 -> cmp4 [EQUAL_KW];
            cmp4 -> {end} [TO_KW];
        {start} -> cmp5 [AT_KW];
            cmp5 -> {end} [LEAST_KW] tag={tagname};
            cmp5 -> {end} [MOST_KW] tag={tagname};
        {start} -> {end} [APPROXIMATELY_KW] tag={tagname};
    """

    return text.format(**locs)

get_disjunctive_comparison_part

get_disjunctive_comparison_part(
    start_loc: str, end_2locs: List[str], prefix: str = ""
) -> str

State machine part implementing the 'comparison-rule-line' comparisons.

comparison-rule-line ::= comparison { "or" comparison }

comparison ::= argument-name ( constraint-rule-literal | requirement-rule-literal )

constraint-rule-literal ::= "is" compare-op bound

requirement-rule-literal ::= auxiliary-verb "be" ( compare-op bound | objective )

compare-op ::= "smaller" "than" | "greater" "than" | "not" "equal" "to" | "equal" "to" | "at" "least" | "at" "most" | "approximately"

bound ::= argument-name | VALUE [ UNIT ] | "t.b.d." [ UNIT ]

objective ::= "maximized" | "minimized"

Source code in src/raesl/compile/machine_files/machine_parts.py
def get_disjunctive_comparison_part(start_loc: str, end_2locs: List[str], prefix: str = "") -> str:
    """State machine part implementing the 'comparison-rule-line' comparisons.

    comparison-rule-line ::=
        comparison { "or" comparison }

    comparison ::=
        argument-name ( constraint-rule-literal | requirement-rule-literal )

    constraint-rule-literal ::=
        "is" compare-op bound

    requirement-rule-literal ::=
        auxiliary-verb "be" ( compare-op bound | objective )

    compare-op ::=
        "smaller" "than" | "greater" "than" | "not" "equal" "to" |
        "equal" "to" | "at" "least" | "at" "most" | "approximately"

    bound ::=
        argument-name | VALUE [ UNIT ] | "t.b.d." [ UNIT ]

    objective ::=
        "maximized" | "minimized"
    """
    assert len(end_2locs) == 2

    locs = {"start": start_loc, "end1": end_2locs[0], "end2": end_2locs[1]}
    locs.update(make_loc_names(prefix, "dis", 3))

    text = (
        """\
        {dis1}; {dis2}; {dis3};

        {start} -> {dis1} [DOTTEDNAME] tag=first_var;
    """
        + get_is_auxiliary_verb_part(locs["dis1"], [locs["dis2"]], tagname="is_aux")
        + """\
        {dis2} -> {end2} [MAXIMIZED_KW] tag=objective;
        {dis2} -> {end2} [MINIMIZED_KW] tag=objective;
    """
        + get_compare_op_part(locs["dis2"], [locs["dis3"]], tagname="compare_op")
        + """\
        {dis3} -> {end1} [NONCOMMA] tag=varvalue;
        {end1} -> {end2} [NONCOMMA] tag=unit;

        {end1} -> {start} [OR_KW] tag=or;
        {end2} -> {start} [OR_KW] tag=or;
    """
    )

    return text.format(**locs)

get_does_auxiliary_verb_part

get_does_auxiliary_verb_part(
    start_loc: str, end_1locs: List[str], tagname: str
) -> str

State machine part that implements:

"does" | auxiliary-verb

Source code in src/raesl/compile/machine_files/machine_parts.py
def get_does_auxiliary_verb_part(start_loc: str, end_1locs: List[str], tagname: str) -> str:
    """State machine part that implements:

    "does" | auxiliary-verb
    """
    assert len(end_1locs) == 1

    text = """\
        {start} -> {end} [DOES_KW] tag={tagname};
    """ + get_auxiliary_verb_part(
        start_loc, [end_1locs[0]], tagname
    )

    return text.format(start=start_loc, end=end_1locs[0], tagname=tagname)

get_is_auxiliary_verb_part

get_is_auxiliary_verb_part(
    start_loc: str,
    end_1locs: List[str],
    tagname: str,
    prefix: str = "",
) -> str

State machine part implementing:

"is" | auxiliary-verb "be"

Source code in src/raesl/compile/machine_files/machine_parts.py
def get_is_auxiliary_verb_part(
    start_loc: str, end_1locs: List[str], tagname: str, prefix: str = ""
) -> str:
    """State machine part implementing:

    "is" | auxiliary-verb "be"
    """
    assert len(end_1locs) == 1

    locs = {"start": start_loc, "end": end_1locs[0], "tagname": tagname}
    locs.update(make_loc_names(prefix, "isauxloc", 1))

    text = """\
        {isauxloc1};

        {start} -> {end} [IS_KW] tag={tagname};
        {isauxloc1} -> {end} [BE_KW];
    """ + get_auxiliary_verb_part(
        start_loc, [locs["isauxloc1"]], tagname
    )

    return text.format(**locs)

needs

Line matcher state machines for needs.

parameters

Line matcher state machines for parameters.

relation_definitions

Line matcher state machines of relation definitions.

relation_instances

Line matcher state machines for relation instances.

sub_clause

Line matcher for sub-clause lines.

decode_disjunctive_comparisons

decode_disjunctive_comparisons(
    tags: Dict[str, List[Token]]
) -> Expression

Decode tags of a matched 'machine_parts.get_disjunctive_comparison_part' part to a disjunction of comparisons.

Parameters:

Name Type Description Default
tags Dict[str, List[Token]]

Extracted data from a match of the machine defined in 'machine_parts.get_disjunctive_comparison_part'.

required

Returns:

Type Description
Expression

The expression equivalent to the matched text.

Source code in src/raesl/compile/machine_files/sub_clause.py
def decode_disjunctive_comparisons(tags: Dict[str, List["Token"]]) -> exprs.Expression:
    """Decode tags of a matched 'machine_parts.get_disjunctive_comparison_part' part
    to a disjunction of comparisons.

    Arguments:
        tags: Extracted data from a match of the machine defined in
            'machine_parts.get_disjunctive_comparison_part'.

    Returns:
        The expression equivalent to the matched text.
    """
    split_offsets = (
        cast(List[Optional[int]], [None]) + [tok.offset for tok in tags.get("or", [])] + [None]
    )
    equations: List[exprs.Expression] = []

    unit_tags = tags.get("unit", [])
    compare_tags = tags.get("compare_op", [])
    for index in range(1, len(split_offsets)):
        start_offset = split_offsets[index - 1]
        end_offset = split_offsets[index]

        first_var = get_one(tags["first_var"], start_offset, end_offset)
        lhs = exprs.VariableValue(first_var)

        is_aux = get_one(tags["is_aux"], start_offset, end_offset)
        is_constraint = is_aux.tok_text == "is"

        comp: exprs.Comparison
        compare_op = get_optional(compare_tags, start_offset, end_offset)
        if compare_op is not None:
            varvalue = get_one(tags["varvalue"], start_offset, end_offset)
            unit = get_optional(unit_tags, start_offset, end_offset)

            rhs: exprs.DataValue
            if unit is None and guess_is_var(varvalue):
                rhs = exprs.VariableValue(varvalue)
            else:
                rhs = exprs.Value(varvalue, unit)

            comp = exprs.RelationComparison(is_constraint, lhs, is_aux, compare_op, rhs)
            equations.append(comp)

        else:
            objective = get_one(tags["objective"], start_offset, end_offset)

            comp = exprs.ObjectiveComparison(lhs, is_aux, objective.tok_type == "MAXIMIZED_KW")
            equations.append(comp)

    if len(equations) == 1:
        return equations[0]
    else:
        return exprs.Disjunction(equations)

decode_subclause

decode_subclause(tags: Dict[str, List[Token]]) -> SubClause

Decode tags of a matched subclauses line to one or more disjunctive equations.

Parameters:

Name Type Description Default
tags Dict[str, List[Token]]

Extracted data from a match of the machine defined in SUB_CLAUSE_SPEC.

required

Returns:

Type Description
SubClause

The found subclause.

Source code in src/raesl/compile/machine_files/sub_clause.py
def decode_subclause(tags: Dict[str, List["Token"]]) -> components.SubClause:
    """Decode tags of a matched subclauses line to one or more disjunctive equations.

    Arguments:
        tags: Extracted data from a match of the machine defined in SUB_CLAUSE_SPEC.

    Returns:
        The found subclause.
    """
    label = tags["subclause_label"][0]
    condition = decode_disjunctive_comparisons(tags)
    return components.SubClause(label, condition)

guess_is_var

guess_is_var(varvalue: Token) -> bool

Guess whether the provided token is a variable or a value. (Answer: If it is not "t.b.d." and starts with a letter, it's a variable.)

Source code in src/raesl/compile/machine_files/sub_clause.py
def guess_is_var(varvalue: "Token") -> bool:
    """Guess whether the provided token is a variable or a value.
    (Answer: If it is not "t.b.d." and starts with a letter, it's a variable.)
    """
    text = varvalue.tok_text
    if re.fullmatch("[tT]\\.[Bb]\\.[dD]\\.", text):
        # Some form of TBD is not a variable.
        return False

    first = text[0]
    return first in string.ascii_lowercase or first in string.ascii_uppercase
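The heuristic is self-contained apart from the Token type, so it can be exercised with a minimal stand-in (the real Token comes from the raesl scanner; the namedtuple below is only for this sketch):

```python
import re
import string
from collections import namedtuple

# Minimal stand-in for the scanner's Token type.
Token = namedtuple("Token", ["tok_text"])

def guess_is_var(varvalue):
    """Copy of guess_is_var from the listing above."""
    text = varvalue.tok_text
    if re.fullmatch("[tT]\\.[Bb]\\.[dD]\\.", text):
        # Some form of TBD is not a variable.
        return False
    first = text[0]
    return first in string.ascii_lowercase or first in string.ascii_uppercase

guess_is_var(Token("speed.maximum"))  # True: starts with a letter.
guess_is_var(Token("T.B.D."))         # False: a TBD placeholder.
guess_is_var(Token("3.14"))           # False: starts with a digit.
```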

transforms

Line matcher state machines for transformations.

type_defs

Line matcher state machines for type definitions.

get_constant_specification_part

get_constant_specification_part(
    start_loc: str, end_2locs: List[str], prefix: str = ""
) -> str

State machine part implementing the 'constant-specification' rule.

constant-specification ::= "equal" "to" VALUE [ UNIT ]

Source code in src/raesl/compile/machine_files/type_defs.py
def get_constant_specification_part(start_loc: str, end_2locs: List[str], prefix: str = "") -> str:
    """State machine part implementing the 'constant-specification' rule.

    constant-specification ::=
        "equal" "to" VALUE [ UNIT ]
    """
    assert len(end_2locs) == 2

    locs = {
        "start": start_loc,
        "end1": end_2locs[0],
        "end2": end_2locs[1],
    }
    locs.update(make_loc_names(prefix, "const", 4))

    text = """\
        {const1}; {const2};

        {start}  -> {const1} [EQUAL_KW] tag=has_constant_spec;
        {const1} -> {const2} [TO_KW];
        {const2} -> {end1} [NONCOMMA] tag=const_value;
        {end1}   -> {end2} [NONCOMMA] tag=const_unit;
    """
    return text.format(**locs)

get_enumeration_specification_part

get_enumeration_specification_part(
    start_loc: str, end_2locs: List[str], prefix: str = ""
) -> str

State machine part implementing the 'enumeration-specification' rule.

enumeration-specification ::= "is" "an" "enumeration" "of" VALUE [ UNIT ] { "," VALUE [ UNIT ]}

Source code in src/raesl/compile/machine_files/type_defs.py
def get_enumeration_specification_part(
    start_loc: str, end_2locs: List[str], prefix: str = ""
) -> str:
    """State machine part implementing the 'enumeration-specification' rule.

    enumeration-specification ::=
        "is" "an" "enumeration" "of" VALUE [ UNIT ] { "," VALUE [ UNIT ]}
    """
    assert len(end_2locs) == 2

    lenum_locs = make_loc_names(prefix, "lenum", 2)
    locs = {
        "start": start_loc,
        "end": lenum_locs["lenum2"],
    }
    locs.update(lenum_locs)

    text = """
        {lenum1}; {lenum2};

        {start} -> {lenum1} [IS_KW];
        {lenum1} -> {end} [A_KW];
    """
    text = text.format(**locs)
    return text + get_short_enumeration_specification_part(locs["end"], end_2locs, prefix)

get_interval_specification_part

get_interval_specification_part(
    start_loc: str, end_4locs: List[str], prefix: str = ""
) -> str

State machine part implementing the 'interval-specification' rule.

interval-specification ::= "of" interval { "or" interval }

Source code in src/raesl/compile/machine_files/type_defs.py
def get_interval_specification_part(start_loc: str, end_4locs: List[str], prefix: str = "") -> str:
    """State machine part implementing the 'interval-specification' rule.

    interval-specification ::=
        "of" interval { "or" interval }
    """
    assert len(end_4locs) == 4

    locs = {
        "start": start_loc,
        "end1": end_4locs[0],
        "end2": end_4locs[1],
        "end3": end_4locs[2],
        "end4": end_4locs[3],
    }
    locs.update(make_loc_names(prefix, "intval", 6))

    text = """\
        {intval1}; {intval2}; {intval3}; {intval4}; {intval5}; {intval6};

        {start} -> {intval1} [OF_KW] tag=has_intval_spec;

        {intval1} -> {intval2} [AT_KW];
        {intval2} -> {intval3} [LEAST_KW];
        {intval3} -> {end1}    [NONCOMMA] tag=lowerbound_value;
        {end1}    -> {end2}    [NONCOMMA] tag=lowerbound_unit;

        {end1}    -> {intval4} [AND_KW] tag=and;
        {end2}    -> {intval4} [AND_KW] tag=and;

        {intval4} -> {intval5} [AT_KW];
        {intval5} -> {intval6} [MOST_KW];
        {intval6} -> {end3}    [NONCOMMA] tag=upperbound_value;
        {end3}    -> {end4}    [NONCOMMA] tag=upperbound_unit;

        {intval2} -> {intval6} [MOST_KW]; # Skip 'at least'

        {end1}    -> {intval1} [OR_KW] tag=or; # 'or' after lower bound
        {end2}    -> {intval1} [OR_KW] tag=or;

        {end3}   -> {intval1}  [OR_KW] tag=or; # 'or' after upper bound
        {end4}   -> {intval1}  [OR_KW] tag=or;
    """
    return text.format(**locs)

get_short_enumeration_specification_part

get_short_enumeration_specification_part(
    start_loc: str, end_2locs: List[str], prefix: str = ""
) -> str

State machine part implementing the short 'enumeration-specification' rule.

short-enumeration-specification ::= "enumeration" "of" VALUE [ UNIT ] { "," VALUE [ UNIT ]}

Source code in src/raesl/compile/machine_files/type_defs.py
def get_short_enumeration_specification_part(
    start_loc: str, end_2locs: List[str], prefix: str = ""
) -> str:
    """State machine part implementing the short 'enumeration-specification' rule.

    short-enumeration-specification ::=
        "enumeration" "of" VALUE [ UNIT ] { "," VALUE [ UNIT ]}
    """
    assert len(end_2locs) == 2

    locs = {
        "start": start_loc,
        "end1": end_2locs[0],
        "end2": end_2locs[1],
    }
    locs.update(make_loc_names(prefix, "enum", 2))

    text = """\
        {enum1}; {enum2};

        {start} -> {enum1} [ENUMERATION_KW] tag=has_enum_spec;
        {enum1} -> {enum2} [OF_KW];
        {enum2} -> {end1} [NONCOMMA] tag=enum_value;
        {end1} -> {end2} [NONCOMMA] tag=enum_unit;

        {end1} -> {enum2} [COMMA_TK];
        {end2} -> {enum2} [COMMA_TK];
    """
    return text.format(**locs)

get_unit_specification_part

get_unit_specification_part(
    start_loc: str, end_1locs: List[str], prefix: str = ""
) -> str

State machine part implementing the 'unit-specification' rule.

unit-specification ::= "with" ( "unit" | "units" ) UNIT-NAME { "," UNIT-NAME }

Source code in src/raesl/compile/machine_files/type_defs.py
def get_unit_specification_part(start_loc: str, end_1locs: List[str], prefix: str = "") -> str:
    """State machine part implementing the 'unit-specification' rule.

    unit-specification ::=
        "with" ( "unit" | "units" ) UNIT-NAME { "," UNIT-NAME }
    """
    assert len(end_1locs) == 1

    locs = {
        "start": start_loc,
        "end": end_1locs[0],
    }
    locs.update(make_loc_names(prefix, "uspec", 2))

    text = """\
        {uspec1}; {uspec2};

        {start} -> {uspec1} [WITH_KW];
        {uspec1} -> {uspec2} [UNIT_KW] tag=has_unit_spec;
        {uspec2} -> {end} [NONCOMMA] tag=unit_name;
        {end} -> {uspec2} [COMMA_TK];
    """
    return text.format(**locs)

typing

Typing support for the machines data.

utils

Utility functions.

get_one

get_one(
    tokens: List[Token],
    start_offset: Optional[int],
    end_offset: Optional[int],
) -> Token

Filter tokens on the provided start and end offsets, and return the only token between the positions.

Source code in src/raesl/compile/machine_files/utils.py
def get_one(tokens: List[Token], start_offset: Optional[int], end_offset: Optional[int]) -> Token:
    """Filter tokens on the provided start and end offsets, and return the only token
    between the positions.
    """
    matches = [tok for tok in tokens if _in_range(tok, start_offset, end_offset)]
    assert len(matches) == 1
    return matches[0]

get_optional

get_optional(
    tokens: List[Token],
    start_offset: Optional[int],
    end_offset: Optional[int],
) -> Optional[Token]

Filter tokens on the provided start and end offsets, and return the token between the positions if there is one, else None.

Source code in src/raesl/compile/machine_files/utils.py
def get_optional(
    tokens: List[Token], start_offset: Optional[int], end_offset: Optional[int]
) -> Optional[Token]:
    """Filter tokens on the provided start and end offsets, and return the only token
    between the positions.
    """
    matches = [tok for tok in tokens if _in_range(tok, start_offset, end_offset)]
    assert len(matches) < 2
    if matches:
        return matches[0]
    return None
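The private helper `_in_range` is not shown in the listings; the sketch below assumes it means "the token's offset lies within [start_offset, end_offset), with None meaning unbounded", which matches how the decoders slice tags between 'or' separators:

```python
from collections import namedtuple
from typing import List, Optional

# Minimal stand-in for the scanner's Token type.
Token = namedtuple("Token", ["tok_type", "tok_text", "offset"])

def _in_range(tok: Token, start: Optional[int], end: Optional[int]) -> bool:
    # Assumed semantics: offset in [start, end), None meaning unbounded.
    if start is not None and tok.offset < start:
        return False
    if end is not None and tok.offset >= end:
        return False
    return True

def get_one(tokens: List[Token], start_offset, end_offset) -> Token:
    matches = [tok for tok in tokens if _in_range(tok, start_offset, end_offset)]
    assert len(matches) == 1  # Exactly one token must fall in the range.
    return matches[0]

def get_optional(tokens: List[Token], start_offset, end_offset) -> Optional[Token]:
    matches = [tok for tok in tokens if _in_range(tok, start_offset, end_offset)]
    assert len(matches) < 2  # At most one token may fall in the range.
    return matches[0] if matches else None

toks = [Token("NONCOMMA", "10", 4), Token("NONCOMMA", "mm", 7)]
value = get_one(toks, 0, 5)       # The token at offset 4.
unit = get_optional(toks, 8, None)  # None: no token at offset >= 8.
```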

make_loc_names

make_loc_names(
    prefix: str, name: str, count: int
) -> Dict[str, str]

Make a dict with 'count' names by combining the prefix, name, and a number.

Source code in src/raesl/compile/machine_files/utils.py
def make_loc_names(prefix: str, name: str, count: int) -> Dict[str, str]:
    """Make a dict with 'count' names by combining the prefix, name, and a number."""
    return dict((name + str(num), prefix + name + str(num)) for num in range(1, count + 1))

variables

Line matcher state machines for variable declarations.

verb_defs

Line matcher state machines for verb and pre-position definitions.

parser

Parsing the Elephant Specification Language.

LineMachineStepper

LineMachineStepper(
    machine: ProcessingStateMachine,
    lexer: Lexer,
    dest_loc: Location,
)

Class managing parsing of a line in ESL.

Parameters:

Name Type Description Default
machine ProcessingStateMachine

Line matching machine to use for recognizing the line.

required
lexer Lexer

Lexer to use in the matching process.

required
dest_loc Location

New specification location if a match was found.

required

Attributes:

Name Type Description
current_loc Optional[Location]

Current location in the line machine.

tags Dict[str, List[Token]]

Relevant data extracted from the text line during the parsing process.

matched_tokens List[Token]

Tokens used for matching the line thus far.

Source code in src/raesl/compile/parser.py
def __init__(self, machine: "ProcessingStateMachine", lexer: "Lexer", dest_loc: "Location"):
    self.machine = machine
    self.lexer = lexer
    self.current_loc: Optional["Location"] = machine.initial_loc
    self.tags: Dict[str, List["Token"]] = {}
    self.matched_tokens: List["Token"] = []

    self.dest_loc = dest_loc

get_accept_name

get_accept_name() -> str

Get the acceptance name associated with the accepting location.

Returns:

Type Description
str

Name of the accepted 'rule'.

Source code in src/raesl/compile/parser.py
def get_accept_name(self) -> str:
    """Get the acceptance name associated with the accepting location.

    Returns:
        Name of the accepted 'rule'.
    """
    assert self.current_loc
    assert self.current_loc.accept is not None
    return self.current_loc.accept

is_accepting

is_accepting() -> bool

Is the machine in an accepting location?

Source code in src/raesl/compile/parser.py
def is_accepting(self) -> bool:
    """Is the machine in an accepting location?"""
    assert self.current_loc
    return self.current_loc.accept is not None

try_step

try_step() -> bool

Try to match the next token.

Returns:

Type Description
bool

Whether progress was made.

Source code in src/raesl/compile/parser.py
def try_step(self) -> bool:
    """Try to match the next token.

    Returns:
        Whether progress was made.
    """
    if not self.current_loc:
        return False

    match = self.machine.single_step(self.lexer, self.current_loc, self.tags)
    if not match:
        self.current_loc = None
        return False

    self.current_loc = match[0]
    self.matched_tokens.append(match[1])
    return True

collect_locations

collect_locations(spec_state: Location) -> List[Location]

Collect the set of states reachable from 'spec_state' by taking only 'epsilon' transitions.

Source code in src/raesl/compile/parser.py
def collect_locations(spec_state: "Location") -> List["Location"]:
    """Collect the set reachable states from 'spec_state' without taking only 'epsilon'
    transitions.
    """
    reachables = [spec_state]
    notdone = [spec_state]
    while notdone:
        spec_loc = notdone.pop()
        for edge in spec_loc.out_edges:
            if edge.tok_type == "epsilon":
                new_loc = edge.dest
                if (
                    new_loc not in reachables
                ):  # Assuming a small number of locations in 'reachables'.
                    reachables.append(new_loc)
                    notdone.append(new_loc)
    return reachables
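The epsilon closure this computes can be demonstrated with minimal stand-ins for Location and Edge (the classes and location names below are illustration-only; the real types live in the machine builder):

```python
class Edge:
    def __init__(self, dest, tok_type):
        self.dest = dest
        self.tok_type = tok_type

class Location:
    def __init__(self, name):
        self.name = name
        self.out_edges = []

def collect_locations(spec_state):
    """Copy of collect_locations from the listing above."""
    reachables = [spec_state]
    notdone = [spec_state]
    while notdone:
        spec_loc = notdone.pop()
        for edge in spec_loc.out_edges:
            if edge.tok_type == "epsilon":
                new_loc = edge.dest
                if new_loc not in reachables:
                    reachables.append(new_loc)
                    notdone.append(new_loc)
    return reachables

a, b, c = Location("a"), Location("b"), Location("c")
a.out_edges.append(Edge(b, "epsilon"))     # a --eps--> b is followed.
b.out_edges.append(Edge(c, "DOTTEDNAME"))  # Non-epsilon edges are not followed.
names = [loc.name for loc in collect_locations(a)]
```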

parse_lexer

parse_lexer(
    lexer: Lexer,
    diag_store: Optional[DiagnosticStore],
    builder: Optional[AstBuilder],
    doc_comments: Optional[List[Token]],
) -> Tuple[DiagnosticStore, AstBuilder, List[Token], bool]

Parse an ESL lexer, storing collected information in the builder.

Parameters:

Name Type Description Default
lexer Lexer

Lexer pointing at the start of the specification text.

required
diag_store Optional[DiagnosticStore]

Diagnostic store if one already has been created.

required
builder Optional[AstBuilder]

Builder if one already has been created.

required
doc_comments Optional[List[Token]]

Doc comments if any have been found yet.

required

Returns:

Type Description
DiagnosticStore

Diagnostic store instance.

AstBuilder

Builder instance.

List[Token]

Found doc comments.

bool

Whether there has been an error.

Source code in src/raesl/compile/parser.py
def parse_lexer(
    lexer: "Lexer",
    diag_store: Optional[diagnostics.DiagnosticStore],
    builder: Optional[ast_builder.AstBuilder],
    doc_comments: Optional[List["Token"]],
) -> Tuple[diagnostics.DiagnosticStore, ast_builder.AstBuilder, List["Token"], bool]:
    """Parse an ESL lexer, storing collected information in the builder.

    Arguments:
        lexer: Lexer pointing at the start of the specification text.
        diag_store: Diagnostic store if one already has been created.
        builder: Builder if one already has been created.
        doc_comments: Doc comments if any have been found yet.

    Returns:
        Diagnostic store instance.
        Builder instance.
        Found doc comments.
        Whether there has been an error.
    """
    diag_store = diagnostics.DiagnosticStore() if diag_store is None else diag_store
    builder = ast_builder.AstBuilder(diag_store) if builder is None else builder
    doc_comments = [] if doc_comments is None else doc_comments

    assert esl_lines.ESL_MACHINE.initial_loc is not None
    spec_state: Location = esl_lines.ESL_MACHINE.initial_loc
    while True:
        new_spec_state, new_lexer = parse_line(spec_state, lexer, builder, diag_store)
        if new_lexer is None:
            # Something bad happened, message should be in problem storage.
            doc_comments.extend(lexer.doc_comments)
            return diag_store, builder, doc_comments, True

        if new_spec_state is None:
            # EOF reached in an accepting state, done!
            # Check the collected data and construct an AST, return the found
            # diagnostics and the created specification if possible.
            doc_comments.extend(lexer.doc_comments)
            return diag_store, builder, doc_comments, False

        # Else, matched one line, do the next.
        spec_state = new_spec_state
        lexer = new_lexer

parse_line

parse_line(
    spec_state: Location,
    lexer: Lexer,
    builder: AstBuilder,
    diag_store: DiagnosticStore,
) -> Tuple[Optional[Location], Optional[Lexer]]

Parse a text-line in ESL.

For debugging line selection, set scanner.PARSER_DEBUG to True, which enables printing debug information to the standard output. For best results, use a small input specification, as the output is quite verbose.

Parameters:

Name Type Description Default
spec_state Location

Location in the ESL language state machine.

required
lexer Lexer

Lexer pointing at the start of the next line to match.

required
builder AstBuilder

Class storing extracted parse data.

required
diag_store DiagnosticStore

Storage for reported diagnostics.

required

Returns:

Type Description
Optional[Location]

Next state in the ESL state machine, unless parsing successfully finished.

Optional[Lexer]

Next lexer to use if the next step can be performed.

Source code in src/raesl/compile/parser.py
def parse_line(
    spec_state: "Location",
    lexer: "Lexer",
    builder: ast_builder.AstBuilder,
    diag_store: diagnostics.DiagnosticStore,
) -> Tuple[Optional["Location"], Optional["Lexer"]]:
    """Parse a text-line in ESL.

    For debugging line selection, set scanner.PARSER_DEBUG to True, which enables
    printing debug information to the std output. For best results, use a *small*
    input specification, as the output is quite verbose.

    Arguments:
        spec_state: Location in the ESL language state machine.
        lexer: Lexer pointing at the start of the next line to match.
        builder: Class storing extracted parse data.
        diag_store: Storage for reported diagnostics.

    Returns:
        Next state in the ESL state machine unless successfully finished, next lexer to
        use if next step can be performed.
    """
    reachable_locs = collect_locations(spec_state)
    logger.debug(f"** line {lexer.line_num + 1} (1-based) *************************")
    rloc_names = ",".join(rl.name for rl in reachable_locs)
    logger.debug(f"** parse_line({spec_state.name} -> [{rloc_names}])")

    # Skip over empty lines.
    while lexer.find("NL_TK"):
        continue

    # Check for EOF.
    if lexer.find("EOF_TK"):
        if any(loc.accept for loc in reachable_locs):
            return None, lexer  # Reached EOF at an accepting state, success!

        # EOF but not expecting it, report a problem.
        diag_store.add(diagnostics.E100(lexer.get_location()))
        return spec_state, None

    # Found a line of text. Find a match.
    #
    # Create line steppers for all possible matches.
    steppers = []
    for spec_loc in reachable_locs:
        for edge in spec_loc.out_edges:
            if edge.tok_type == "epsilon":
                continue

            machine = esl_lines.get_line_machine(edge.tok_type)
            logger.debug(f"** Add '{machine.name}' line machine.")
            steppers.append(LineMachineStepper(machine, lexer.copy(), edge.dest))

    assert steppers  # There should be at least one stepper.

    # Take steps, silently dropping steppers that don't match, until all don't match or
    # at least one stepper is accepting.
    acceptors = []
    while steppers:
        prev_steppers = steppers
        steppers = []
        logger.debug("** -----")
        steppers_text = ", ".join(s.machine.name for s in prev_steppers)
        logger.debug(f"** Steppers remaining: {steppers_text}")
        for stepper in prev_steppers:
            if not stepper.try_step():
                logger.debug(f"** Stepper {stepper.machine.name} didn't match.")
                continue  # Next token didn't match, drop the stepper silently.
            if stepper.is_accepting():
                # Stepper has reached the end, continue with the others.
                logger.debug(f"** Stepper {stepper.machine.name} has reached the end.")
                acceptors.append(stepper)
                continue

            logger.debug(f"** Stepper {stepper.machine.name} matched.")

            steppers.append(stepper)

    # All steppers either failed or finished in an accepting location.
    best_acceptor: Optional[LineMachineStepper]
    if acceptors:
        if len(acceptors) == 1:
            # Life is simple, pick the one and only match.
            best_acceptor = acceptors[0]
        else:
            # Multiple lines match. Filter on the most specific match.
            best_match = None
            best_acceptor = None
            for acceptor in acceptors:
                match_prio = [get_token_priority(tok.tok_type) for tok in acceptor.matched_tokens]
                if best_match is None or match_prio < best_match:
                    best_match = match_prio
                    best_acceptor = acceptor
                elif best_acceptor is not None and match_prio == best_match:
                    ambi_acceptors = (
                        best_acceptor.get_accept_name(),
                        acceptor.get_accept_name(),
                    )
                    best_acceptor = None  # Ambiguous best match (until now)

            # Sanity check and raise an Exception if it fails.
            if best_acceptor is None:
                diag_store.add(diagnostics.E101(ambi_acceptors, location=lexer.get_location()))

        # Store information of the line into the ast builder instance.
        assert best_acceptor is not None
        processing_func = best_acceptor.machine.processing_func
        if processing_func:
            tags = best_acceptor.tags
            accept = best_acceptor.get_accept_name()
            processing_func(tags, accept, builder)

        # Line done.
        return best_acceptor.dest_loc, best_acceptor.lexer

    # No acceptors, thus they all failed to match. Find a stepper that got the furthest.
    # mypy fails on lambdas
    # best_stepper = max(prev_steppers, key=lambda s: s.lexer.get_linecol())
    def linecol_value(line_stepper: LineMachineStepper) -> Tuple[int, int]:
        return line_stepper.lexer.get_linecol()

    # Report a syntax error
    best_stepper = max(prev_steppers, key=linecol_value)
    diag_store.add(diagnostics.E102(location=best_stepper.lexer.get_location()))

    return spec_state, None

parse_spec

parse_spec(
    lexers: Iterable[Lexer],
    diag_store: Optional[DiagnosticStore] = None,
) -> Tuple[DiagnosticStore, Optional[Specification]]

Parse an ESL specification, storing collected information in the builder.

Parameters:

Name Type Description Default
lexers Iterable[Lexer]

Lexers pointing at the start of their respective texts (e.g. per-file).

required

Returns:

Type Description
Tuple[DiagnosticStore, Optional[Specification]]

The found diagnostics, and if successful, the type-checked output.

Source code in src/raesl/compile/parser.py
def parse_spec(
    lexers: Iterable["Lexer"], diag_store: Optional[diagnostics.DiagnosticStore] = None
) -> Tuple[diagnostics.DiagnosticStore, Optional["Specification"]]:
    """Parse an ESL specification, storing collected information in the builder.

    Arguments:
        lexers: Lexers pointing at the start of their respective texts (e.g. per-file).

    Returns:
        The found diagnostics, and if successful, the type-checked output.
    """
    diag_store = diagnostics.DiagnosticStore() if diag_store is None else diag_store
    builder = ast_builder.AstBuilder(diag_store)
    doc_comments: List["Token"] = []

    for lexer in lexers:
        diag_store, builder, doc_comments, error = parse_lexer(
            lexer, diag_store, builder, doc_comments
        )
        if error:
            return diag_store, None

    spec = builder.finish_parse(doc_comments)
    return diag_store, spec

scanner

Lexer for on-demand recognition of tokens.

Because the language allows unrestrained text in its needs, lexing the entire input beforehand does not work in all cases. Instead, the scanner tries to match tokens on demand.

There is also overlap in matching between tokens (a NONSPACE expression matches almost all other tokens as well, and a NAME expression matches all keywords). The rule applied here (by sorting the edges in the state machines) is that specific wins over generic. For example, if at some point both the OR_KW and the NONSPACE token may be used and the text is "or", the OR_KW token is chosen.
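The "specific wins over generic" rule can be sketched with plain `re` patterns. This is a standalone illustration with hypothetical patterns and priorities, not the actual raesl token table:

```python
import re

# Hypothetical token priorities: a lower value means "more specific".
TOKEN_PRIOS = {"OR_KW": 1, "NAME": 2, "NONSPACE": 4}
TOKEN_PATTERNS = {
    "OR_KW": re.compile(r"or\b"),
    "NAME": re.compile(r"[A-Za-z_][A-Za-z0-9_]*"),
    "NONSPACE": re.compile(r"\S+"),
}

def best_token(text, candidates):
    # Trying candidates in priority order mirrors the effect of sorting
    # edges in the state machines: the keyword beats the catch-all.
    for tok_type in sorted(candidates, key=TOKEN_PRIOS.get):
        if TOKEN_PATTERNS[tok_type].match(text):
            return tok_type
    return None

assert best_token("or", ["NONSPACE", "OR_KW"]) == "OR_KW"
assert best_token("orbit", ["NONSPACE", "OR_KW"]) == "NONSPACE"
```

Note that "orbit" does not match OR_KW because of the `\b` word boundary, so the generic NONSPACE pattern wins there.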

Lexer

Lexer(
    fname: Optional[str],
    text: str,
    offset: int,
    line_offset: int,
    line_num: int,
    doc_comments: List[Token],
)

On-demand scanner.

For debugging token matching, enable the PARSER_DEBUG flag near the top of the file. That also enables debug output in parser.parse_line, which shows what line is being tried and which line-match steppers are running.

Parameters:

Name Type Description Default
fname Optional[str]

Name of the file containing the text, may be None.

required
text str

Input text.

required
offset int

Offset of the current position in the text.

required
line_offset int

Offset of the first character of the current line in the input text.

required
line_num int

Line number of the current line.

required
doc_comments List[Token]

Documentation comments found so far, shared between all scanners.

required

Attributes:

Name Type Description
length int

Length of the text, derived from the input text.

Source code in src/raesl/compile/scanner.py
def __init__(
    self,
    fname: Optional[str],
    text: str,
    offset: int,
    line_offset: int,
    line_num: int,
    doc_comments: List[Token],
):
    self.fname = fname
    self.text = text
    self.length = len(text)
    self.offset = offset
    self.line_offset = line_offset
    self.line_num = line_num
    self.doc_comments = doc_comments

copy

copy() -> Lexer

Make copy of self. New scanner at the same position as self.

Source code in src/raesl/compile/scanner.py
def copy(self) -> "Lexer":
    """Make copy of self. New scanner at the same position as self."""
    return Lexer(
        self.fname,
        self.text,
        self.offset,
        self.line_offset,
        self.line_num,
        self.doc_comments,
    )

find

find(tok_type: str) -> Optional[Token]

Try to find the requested token.

Parameters:

Name Type Description Default
tok_type str

Type name of the token.

required

Returns:

Type Description
Optional[Token]

Found token, or None.

Source code in src/raesl/compile/scanner.py
def find(self, tok_type: str) -> Optional[Token]:
    """Try to find the requested token.

    Arguments:
        tok_type: Type name of the token.

    Returns:
        Found token, or None.
    """
    self.skip_white()
    pat = TOKENS.get(tok_type)
    if pat:
        match = pat.match(self.text, self.offset)
        if not match:
            logger.debug("Lexer failed {}".format(tok_type))
            return None

        tok = Token(
            tok_type,
            match[0],
            self.fname,
            self.offset,
            self.line_offset,
            self.line_num,
        )
        self.offset = match.end()
        logger.debug("Lexer matched {}".format(tok))
        return tok

    elif tok_type == "NL_TK":
        if self.offset < self.length and self.text[self.offset] == "\n":
            tok = Token(
                tok_type,
                "\n",
                self.fname,
                self.offset,
                self.line_offset,
                self.line_num,
            )
            self.offset = self.offset + 1
            self.line_offset = self.offset
            self.line_num = self.line_num + 1
            logger.debug("Lexer matched {}".format(tok))
            return tok

        logger.debug("Lexer failed {}".format(tok_type))
        return None

    else:
        assert tok_type == "EOF_TK", "Found unrecognized token {}.".format(tok_type)
        if self.offset >= self.length:
            tok = Token(
                tok_type,
                "",
                self.fname,
                self.length,
                self.line_offset,
                self.line_num,
            )
            logger.debug("Lexer matched {}".format(tok))
            return tok

        logger.debug("Lexer failed {}".format(tok_type))
        return None
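The on-demand matching in find relies on `re.Pattern.match(text, pos)`, which anchors the pattern at `pos` instead of scanning forward, so the scanner can test for a token exactly at the current cursor position. A small standalone illustration (NAME is a hypothetical pattern, not the raesl token table):

```python
import re

# Anchored matching at an offset: match(text, pos) succeeds only if the
# pattern matches starting exactly at pos.
NAME = re.compile(r"[A-Za-z_][A-Za-z0-9_]*")

text = "foo bar"
match = NAME.match(text, 4)
assert match is not None and match[0] == "bar"  # token found at the cursor
assert NAME.match(text, 3) is None              # cursor on a space: no token
```

This is why the scanner calls skip_white first: whitespace must be consumed before an anchored match can succeed.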

get_linecol

get_linecol() -> Tuple[int, int]

Get line and column information of the next token. Note that as new-lines are significant, such a position may be at an unexpected place, for example at the end of a comment.

Returns:

Type Description
Tuple[int, int]

Line and column information of the next token.

Source code in src/raesl/compile/scanner.py
def get_linecol(self) -> Tuple[int, int]:
    """Get line and column information of the next token. Note that as new-lines
    are significant, such a position may be at an unexpected place, for example at
    the end of a comment.

    Returns:
        Line and column information of the next token.
    """
    return self.line_num, self.offset - self.line_offset

get_location

get_location() -> Location

Get location information of the next token. Note that such a position may be at an unexpected place since new-lines are significant. For example, it may be at the end of a comment.

Returns:

Type Description
Location

Location information of the next token.

Source code in src/raesl/compile/scanner.py
def get_location(self) -> Location:
    """Get location information of the next token. Note that such a position may be
    at an unexpected place since new-lines are significant. For example, it may be
    at the end of a comment.

    Returns:
        Location information of the next token.
    """
    fname = self.fname if self.fname is not None else "unknown-file"
    line, col = self.get_linecol()
    return utils.get_location(uri=fname, start_line=line, start_character=col)

skip_white

skip_white()

Skip white space, triple dots, newlines, and comments. Implements the following Graphviz diagram:

   digraph white {
       1 -> 1 [label="spc+"]
       1 -> 99 [label="eof"]
       1 -> 4 [label="#.*"]
       1 -> 5 [label="..."]
       1 -> 99 [label="other"]

       4 -> 99 [label="eof"]
       4 -> 99 [label="nl"]

       5 -> 5 [label="spc+"]
       5 -> 99 [label="eof"]
       5 -> 1 [label="nl"]
       5 -> 6 [label="#.*"]
       5 -> REV [label="..."]
       5 -> REV [label="other"]

       6 -> 99 [label="eof"]
       6 -> 1 [label="nl"]
   }

   Jump to non-99 location eats the recognized text, REV means the
   last found "..." was a false positive and must be reverted to just
   before that position.

   Note that \n is a significant token, so it is not skipped everywhere.

Source code in src/raesl/compile/scanner.py
def skip_white(self):
    """Skip white space, triple dots, newlines, and comments. Implements the
    following Graphviz diagram:

    digraph white {
        1 -> 1 [label="spc+"]
        1 -> 99 [label="eof"]
        1 -> 4 [label="#.*"]
        1 -> 5 [label="..."]
        1 -> 99 [label="other"]

        4 -> 99 [label="eof"]
        4 -> 99 [label="nl"]

        5 -> 5 [label="spc+"]
        5 -> 99 [label="eof"]
        5 -> 1 [label="nl"]
        5 -> 6 [label="#.*"]
        5 -> REV [label="..."]
        5 -> REV [label="other"]

        6 -> 99 [label="eof"]
        6 -> 1 [label="nl"]
    }

    Jump to non-99 location eats the recognized text, REV means the
    last found "..." was a false positive and must be reverted to just
    before that position.

    Note that \n is a significant token, so it is not skipped everywhere.
    """
    while True:
        # 1:
        match = SPACE_RE.match(self.text, self.offset)
        if match:
            self.offset = match.end(0)

        if self.offset >= self.length:
            return

        if self.text[self.offset] == "#":
            # 4, starting with matching ".*":
            i = self.text.find("\n", self.offset + 1)
            self._save_doc_comment(self.offset, i)
            if i < 0:
                self.offset = self.length
                return
            else:
                self.offset = i  # A '\n' is needed to end the current line in the parser!
                return

        if self.text.startswith("...", self.offset):
            # 5:
            # Switch to using 'tmp_offset' as the offset, as we may have
            # to revert skipping ... .
            tmp_offset = self.offset + 3
            match = SPACE_RE.match(self.text, tmp_offset)
            if match:
                tmp_offset = match.end(0)

            if tmp_offset >= self.length:
                self.offset = self.length
                return

            char = self.text[tmp_offset]
            if char == "\n":  # "... \n" found
                self.offset = tmp_offset + 1
                self.line_offset = self.offset
                self.line_num = self.line_num + 1
                continue

            if char == "#":  # "... #.*\n" found?
                # 6, starting with matching ".*":
                i = self.text.find("\n", tmp_offset + 1)
                self._save_doc_comment(tmp_offset, i)
                if i < 0:
                    self.offset = self.length
                    return
                else:
                    self.offset = i + 1
                    self.line_offset = self.offset
                    self.line_num = self.line_num + 1
                    continue

            # Continuation of 5.
            # Found more text, '...' was a false positive, don't skip it.
            return

        # Continuation of 1, 'other' case
        return
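The continuation rule described by the diagram ("..." followed by a newline joins lines; any other text after the dots makes them a false positive) can be sketched in a simplified standalone form. Comment handling and line counting are omitted; this is an illustration, not the raesl scanner:

```python
def skip_white(text: str, offset: int) -> int:
    """Return the offset of the next significant character, joining
    lines where '...' is followed (possibly after spaces) by a newline."""
    while True:
        # State 1: eat plain spaces and tabs.
        while offset < len(text) and text[offset] in " \t":
            offset += 1
        if text.startswith("...", offset):
            # State 5: probe past the dots without committing yet.
            probe = offset + 3
            while probe < len(text) and text[probe] in " \t":
                probe += 1
            if probe < len(text) and text[probe] == "\n":
                offset = probe + 1  # continuation: eat dots and newline
                continue
            return offset  # REV: false positive, keep the dots for the parser
        return offset

assert skip_white("   x", 0) == 3       # plain spaces skipped
assert skip_white("... \ny", 0) == 5    # continuation joins the lines
assert skip_white("...x", 0) == 0       # dots followed by text: not skipped
```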

Token

Token(
    tok_type: str,
    tok_text: str,
    fname: Optional[str],
    offset: int,
    line_offset: int,
    line_num: int,
)

Data of a matched token.

Parameters:

Name Type Description Default
tok_type str

Type name of the token.

required
tok_text str

Text of the token.

required
fname Optional[str]

Name of the file containing the text.

required
offset int

Offset of the current position in the input text.

required
line_offset int

Offset of the first character of the current line in the input text.

required
line_num int

Line number of the current line.

required
Source code in src/raesl/compile/scanner.py
def __init__(
    self,
    tok_type: str,
    tok_text: str,
    fname: Optional[str],
    offset: int,
    line_offset: int,
    line_num: int,
):
    self.tok_type = tok_type
    self.tok_text = tok_text
    self.fname = fname
    self.offset = offset
    self.line_offset = line_offset
    self.line_num = line_num

get_location

get_location(offset: int = 0) -> Location

Get this token's Location.

Source code in src/raesl/compile/scanner.py
def get_location(self, offset: int = 0) -> Location:
    """Get this token's Location."""
    fname = self.fname if self.fname is not None else "unknown-file"

    if offset < 0:
        offset = 0
    elif offset >= len(self.tok_text):
        offset = len(self.tok_text)

    line, col = self.line_num, self.offset - self.line_offset
    return utils.get_location(
        uri=fname,
        start_line=line,
        start_character=col,
        end_line=line,
        end_character=col + offset,
    )

get_token_priority

get_token_priority(tok_type: str) -> int

Priority of the tokens. Higher value is less specific.

Parameters:

Name Type Description Default
tok_type str

Name of the token type.

required

Returns:

Type Description
int

Priority of the token.

Source code in src/raesl/compile/scanner.py
def get_token_priority(tok_type: str) -> int:
    """Priority of the tokens. Higher value is less specific.

    Arguments:
        tok_type: Name of the token type.

    Returns:
        Priority of the token.
    """
    tok_prios = {"NONSPACE": 4, "NONCOMMA": 3, "NAME": 2, "DOTTEDNAME": 2, "epsilon": 0}
    return tok_prios.get(tok_type, 1)  # Default priority is 1

state_machine

State machine classes to describe allowed sequences of tokens.

A state machine is a DFA (deterministic finite automaton: at most one edge matches at any point). An edge is associated with a matched token; locations are decision points between tokens.

An edge may tag occurrences of tokens to simplify extraction of relevant information for future compiler phases. A location may mark that a valid sequence of tokens has been matched by being an accepting location.
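The structure described above can be illustrated with toy standalone versions of Location and Edge (stand-ins for illustration, not imported from raesl) and a simple walk over token types:

```python
class Location:
    def __init__(self, name, accept=None):
        self.name = name
        self.accept = accept      # name of the accepted rule, if any
        self.out_edges = []       # outgoing edges, initially empty

class Edge:
    def __init__(self, dest, tok_type):
        self.dest = dest
        self.tok_type = tok_type  # token type that triggers this edge

# A tiny machine: DEFINE_KW then NAME, accepting at the end.
start, mid, end = Location("s0"), Location("s1"), Location("s2", accept="definition")
start.out_edges.append(Edge(mid, "DEFINE_KW"))
mid.out_edges.append(Edge(end, "NAME"))

def walk(loc, tok_types):
    """Follow matching edges; return the accept name of the final location."""
    for tok_type in tok_types:
        nxt = next((e.dest for e in loc.out_edges if e.tok_type == tok_type), None)
        if nxt is None:
            return None  # no edge matched: the sequence is invalid
        loc = nxt
    return loc.accept

assert walk(start, ["DEFINE_KW", "NAME"]) == "definition"
assert walk(start, ["NAME"]) is None
assert walk(start, ["DEFINE_KW"]) is None  # stopped before an accepting location
```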

Edge

Edge(
    dest: Location,
    tok_type: str,
    tag_name: Optional[str] = None,
)

Edge to a next location.

Parameters:

Name Type Description Default
dest Location

Destination of the edge.

required
tok_type str

The value of the 'tok_type' attribute of a token that can trigger this transition.

required
tag_name Optional[str]

If not None, the name to use for recording the transition in the state machine.

None
Source code in src/raesl/compile/state_machine.py
def __init__(self, dest: Location, tok_type: str, tag_name: Optional[str] = None):
    self.tok_type = tok_type
    self.tag_name = tag_name
    self.dest = dest
    assert isinstance(dest, Location)

Location

Location(name: str, accept: Optional[str] = None)

Location in a state machine.

Parameters:

Name Type Description Default
accept Optional[str]

Name of the rule that could be accepted at this location. If None, no such rule exists.

None
name str

Name of the location, mostly for identifying purposes.

required

Attributes:

Name Type Description
out_edges List[Edge]

Outgoing edges, initially empty.

Source code in src/raesl/compile/state_machine.py
def __init__(self, name: str, accept: Optional[str] = None):
    self.accept = accept
    self.name = name
    self.out_edges: List[Edge] = []

MatchResult

MatchResult(
    accepted_name: Optional[str],
    accepted_tags: Dict[str, List[Token]],
    lexer: Lexer,
)

Result data of a matching process in StateMachine.match().

Parameters:

Name Type Description Default
accepted_name Optional[str]

Acceptance name from the last visited accepting location.

required
accepted_tags Dict[str, List[Token]]

Collected tag data at the point of accepting.

required
lexer Lexer

Lexer at the time of the last failure, or at the time of the last accept.

required
Source code in src/raesl/compile/state_machine.py
def __init__(
    self,
    accepted_name: Optional[str],
    accepted_tags: Dict[str, List["Token"]],
    lexer: "Lexer",
):
    self.accepted_name = accepted_name
    self.accepted_tags = accepted_tags
    self.lexer = lexer

StateMachine

StateMachine(name: str)

State machine containing locations and edges.

Note that it stores only the initial location; all other locations and edges are reachable from it.

Parameters:

Name Type Description Default
name str

Name of the state machine, also the name of the matched sequence.

required

Attributes:

Name Type Description
initial_loc Optional[Location]

Initial location of the state machine. Set after construction.

Source code in src/raesl/compile/state_machine.py
def __init__(self, name: str):
    self.name = name
    self.initial_loc: Optional[Location] = None

dump

dump(fname: Optional[str] = None)

Dump the state machine to a file in Graphviz format.

Parameters:

Name Type Description Default
fname Optional[str]

If not None, name of the file to write, else a filename is constructed from the name of the state machine.

None
Source code in src/raesl/compile/state_machine.py
def dump(self, fname: Optional[str] = None):
    """Dump the state machine to a file in Graphviz format.

    Arguments:
        fname: If not None, name of the file to write, else a filename is
            constructed from the name of the state machine.
    """
    processed_locs: Set[Location] = set()
    notdone = [self.initial_loc]
    lines = ["digraph G {"]
    while notdone:
        loc = notdone.pop()
        if loc is None or loc in processed_locs:
            continue

        processed_locs.add(loc)
        if loc.accept:
            lines.append(f'    {loc.name} [shape="box"]')
        else:
            lines.append(f"    {loc.name}")

        for edge in loc.out_edges:
            lines.append(f'    {loc.name} -> {edge.dest.name} [label="{edge.tok_type}"]')
            notdone.append(edge.dest)

    lines.append("}")

    if fname is None:
        fname = self.name + ".dot"
    with open(fname, "w") as handle:
        for line in lines:
            handle.write(line)
            handle.write("\n")

match

match(lexer: Lexer) -> Optional[MatchResult]

Try to match the machine against tokens from the scanner.

Parameters:

Name Type Description Default
lexer Lexer

Token stream to match. The instance is useless afterwards; make a copy beforehand if you need it again.

required

Returns:

Type Description
Optional[MatchResult]

Result of the matching process.

Note

This routine is currently only used for testing.

Source code in src/raesl/compile/state_machine.py
def match(self, lexer: "Lexer") -> Optional["MatchResult"]:
    """Try to match the machine against tokens from the scanner.

    Arguments:
        lexer: Token stream to match. Instance is useless afterwards, make a copy
            beforehand if you need it again.

    Returns:
        Result of the matching process.

    Note:
        This routine is currently only used for testing.
    """
    assert self.initial_loc
    current_loc: Location = self.initial_loc
    tags: Dict[str, List["Token"]] = {}

    accepted_name = None
    accepted_tags: Dict[str, List["Token"]] = {}
    accepted_lexer: Optional["Lexer"] = None

    while True:
        # Keep stepping until no progress is possible any more.
        match = self.single_step(lexer, current_loc, tags)
        if not match:
            break

        current_loc = match[0]

        # Update acceptance if necessary.
        if current_loc.accept is not None:
            accepted_name = current_loc.accept
            accepted_tags = dict((k, v.copy()) for k, v in tags.items())
            accepted_lexer = lexer.copy()

    # Done, return the result, either just the lexer at the failed state
    # for its position information, or the last accept.
    if accepted_name is None:
        return MatchResult(None, {}, lexer)

    assert accepted_lexer
    return MatchResult(accepted_name, accepted_tags, accepted_lexer)

single_step

single_step(
    lexer: Lexer,
    current_loc: Location,
    tags: Dict[str, List[Token]],
) -> Optional[Tuple[Location, Token]]

Try to perform a single step in the state machine.

Parameters:

Name Type Description Default
lexer Lexer

Token stream to match. Instance is modified in-place if a transition is taken.

required
current_loc Location

Location to use for finding edges to try.

required
tags Dict[str, List[Token]]

Collected tags so far, may be updated in-place if transition was performed.

required

Returns:

Type Description
Optional[Tuple[Location, Token]]

New location and the matching token if a transition could be performed,

Optional[Tuple[Location, Token]]

else None.

Source code in src/raesl/compile/state_machine.py
def single_step(
    self, lexer: "Lexer", current_loc: Location, tags: Dict[str, List["Token"]]
) -> Optional[Tuple["Location", "Token"]]:
    """Try to perform a single step in the state machine.

    Arguments:
        lexer: Token stream to match. Instance is modified in-place if a transition
            is taken.
        current_loc: Location to use for finding edges to try.
        tags: Collected tags so far, may be updated in-place if transition was
            performed.

    Returns:
        New location and the matching token if a transition could be performed,
        else None.
    """
    if not current_loc.out_edges:
        return None

    for edge in current_loc.out_edges:
        token = lexer.find(edge.tok_type)
        if token is None:
            continue

        # Match found. Due to being a DFA and edges being sorted on priority,
        # this is also the one and only match that we should find for this machine.
        #
        # Update tags
        if edge.tag_name is not None:
            edge_tag = tags.get(edge.tag_name)
            if edge_tag is None:
                tags[edge.tag_name] = [token]
            else:
                edge_tag.append(token)

        # Pass target location back to the caller.
        return edge.dest, token

    return None

sort_edges

sort_edges()

Sort edges of the state machine to get specific tokens checked first.

Source code in src/raesl/compile/state_machine.py
def sort_edges(self):
    """Sort edges of the state machine to get specific tokens checked first."""
    done_locs = set([self.initial_loc])
    found_locs = [self.initial_loc]
    while found_locs:
        loc = found_locs.pop()
        loc.out_edges.sort(key=lambda edge: get_token_priority(edge.tok_type))
        for edge in loc.out_edges:
            if edge.dest not in done_locs:
                done_locs.add(edge.dest)
                found_locs.append(edge.dest)

typechecking

Type checking for ESL.

ast_builder

Classes to collect and store information from the parsing process, check that information for correctness, and construct an AST as result, along with a log of found diagnostics.

The AstBuilder operates at top section level (types, verbs, relation definitions, and component definitions). It leaves all the details of each section to dedicated child builders (thus creating a highly modular checker), and acts as call dispatcher and global controller in the type-checking and AST-building process once parsing has finished.

Notable parts in the class are:

  • Child builders for each top section part.
  • A diagnostics store shared with all the child builders.
  • Storage of doc-comments in the input for attaching them to the correct parts of the produced specification after parsing.
  • A call dispatcher notifying the child builders that a new top or sub-section has been found, allowing them to clean up processing if needed.
  • Entry points for the parser to push found information to the child builders.
  • The 'finish_parse' entry point to perform all type checking, and produce the AST and found diagnostics.
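The register/notify dispatch described above can be sketched with minimal standalone stand-ins (illustrative classes, not the raesl builders): child builders register interest in section notifications and are called back whenever a section boundary is reached.

```python
class ChildBuilder:
    """Stand-in for a section builder that records the notifications it gets."""
    def __init__(self):
        self.events = []

    def notify_new_section(self, new_top_section: bool):
        self.events.append(new_top_section)

class Dispatcher:
    """Stand-in for the AstBuilder's section-notification dispatch."""
    def __init__(self):
        self.section_notify_list = []

    def register_new_section(self, builder):
        self.section_notify_list.append(builder)

    def notify_new_section(self, new_top_section: bool):
        # Fan the notification out to every registered child builder.
        for builder in self.section_notify_list:
            builder.notify_new_section(new_top_section)

dispatcher = Dispatcher()
child = ChildBuilder()
dispatcher.register_new_section(child)
dispatcher.notify_new_section(True)   # new top section found
dispatcher.notify_new_section(False)  # new sub-section within a component
assert child.events == [True, False]
```

This keeps each child builder self-contained: it only needs to know how to finish its own in-progress definitions when notified.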

AstBuilder

AstBuilder(diag_store: DiagnosticStore)

Builder to collect information from the parse process, perform type checking, and produce an AST and reported diagnostics.

Parameters:

Name Type Description Default
diag_store DiagnosticStore

Storage for diagnostics while building the AST.

required

Attributes:

Name Type Description
doc_distributor

Object that distributes doc comments to interested elements of the specification.

section_notify_list List[Union[TypeBuilder, RelationDefBuilder, ComponentDefBuilder]]

Builders to notify of a new section.

type_builder

Builder for constructing types.

verb_builder

Builder for constructing verb/prepositions.

reldef_builder

Builder for constructing relations.

compdef_builder

Builder for constructing components.

Source code in src/raesl/compile/typechecking/ast_builder.py
def __init__(self, diag_store: diagnostics.DiagnosticStore):
    self.diag_store = diag_store
    self.doc_distributor = DocCommentDistributor(self.diag_store)
    self.section_notify_list: List[
        Union[TypeBuilder, RelationDefBuilder, ComponentDefBuilder]
    ] = []

    self.type_builder = TypeBuilder(self)
    self.verb_builder = VerbDefBuilder(self)
    self.reldef_builder = RelationDefBuilder(self)
    self.compdef_builder = ComponentDefBuilder(self)
add_bundle_field
add_bundle_field(
    field_name: Token, type_name: Optional[Token]
)

Forward call to type builder.

Source code in src/raesl/compile/typechecking/ast_builder.py
def add_bundle_field(
    self,
    field_name: "Token",
    type_name: Optional["Token"],
):
    """Forward call to type builder."""
    self.type_builder.add_bundle_field(
        field_name,
        type_name,
    )
add_reldef
add_reldef(name: Token)

Forward call to relation definition builder.

Source code in src/raesl/compile/typechecking/ast_builder.py
def add_reldef(self, name: "Token"):
    """Forward call to relation definition builder."""
    self.reldef_builder.add_reldef(name)
add_typedef
add_typedef(
    type_name: Token,
    parent_name: Optional[Token],
    enum_spec: Optional[List[Value]],
    unit_spec: Optional[List[Token]],
    ival_spec: Optional[
        List[Tuple[Optional[Value], Optional[Value]]]
    ],
    cons_spec: Optional[Value],
)

Forward call to type builder.

Source code in src/raesl/compile/typechecking/ast_builder.py
def add_typedef(
    self,
    type_name: "Token",
    parent_name: Optional["Token"],
    enum_spec: Optional[List[exprs.Value]],
    unit_spec: Optional[List["Token"]],
    ival_spec: Optional[List[Tuple[Optional[exprs.Value], Optional[exprs.Value]]]],
    cons_spec: Optional[exprs.Value],
):
    """Forward call to type builder."""
    self.type_builder.add_typedef(
        type_name, parent_name, enum_spec, unit_spec, ival_spec, cons_spec
    )
add_verbdef
add_verbdef(verb_tok: Token, prepos_tok: Token)

Forward call to verb definition builder.

Source code in src/raesl/compile/typechecking/ast_builder.py
def add_verbdef(self, verb_tok: "Token", prepos_tok: "Token"):
    """Forward call to verb definition builder."""
    self.verb_builder.add_verbdef(verb_tok, prepos_tok)
finish_parse
finish_parse(
    doc_comments: List[Token],
) -> Optional[Specification]

Finish processing the collected information, that is, perform type checking.

Parameters:

Name Type Description Default
doc_comments List[Token]

Raw documentation comments rescued from the scanner.

required
Source code in src/raesl/compile/typechecking/ast_builder.py
def finish_parse(self, doc_comments: List["Token"]) -> Optional[specification.Specification]:
    """Finish processing the collected information, that is, perform type checking.

    Arguments:
        doc_comments: Raw documentation comments rescued from the scanner.
    """
    # Tell all builders current section is done.
    self.notify_new_section(None, True)

    # Convert collected information to AST.
    spec = specification.Specification()
    self.verb_builder.finish(spec)
    self.type_builder.finish(spec)
    self.reldef_builder.finish(spec)  # Requires types.
    self.compdef_builder.finish(spec, self.doc_distributor)  # Requires types, verbs, reldefs.

    # Add specification elements to the doc distributor.
    for elem in specification.get_doc_comment_spec_elements(spec):
        self.doc_distributor.add_element(elem)

    # Hand out all doc comments.
    self.doc_distributor.resolve(doc_comments)
    return spec
new_bundle_type
new_bundle_type(bundle_name: Token)

Forward call to type builder.

Source code in src/raesl/compile/typechecking/ast_builder.py
def new_bundle_type(self, bundle_name: "Token"):
    """Forward call to type builder."""
    self.type_builder.new_bundle_type(bundle_name)
notify_new_section
notify_new_section(
    tok: Optional[Token], new_top_section: bool
)

Parser has started a new section, finish all 'in-progress' definitions.

Parameters:

Name Type Description Default
tok Optional[Token]

Token indicating the position of the new section. None is used for EOF.

required
new_top_section bool

If set, a new type, verbs, or component definition has been found, else a new section within a component has been detected.

required
Source code in src/raesl/compile/typechecking/ast_builder.py
def notify_new_section(self, tok: Optional["Token"], new_top_section: bool):
    """Parser has started a new section, finish all 'in-progress' definitions.

    Arguments:
        tok: Token indicating the position of the new section. None is used for EOF.
        new_top_section: If set, a new type, verbs, or component definition has
            been found, else a new section within a component has been detected.
    """
    if tok:
        # New section started, documentation after this point doesn't belong
        # to a previous element.
        self.doc_distributor.add_dummy_element(tok)

    for builder in self.section_notify_list:
        builder.notify_new_section(new_top_section)
register_new_section
register_new_section(other_builder)

Entry point for a child builder to declare interest in receiving notifications about new sections in the file.

Source code in src/raesl/compile/typechecking/ast_builder.py
def register_new_section(self, other_builder):
    """Entry point for a child builder to declare interest in receiving
    notifications about new sections in the file.
    """
    self.section_notify_list.append(other_builder)
reldef_add_param
reldef_add_param(
    name: Token, type_name: Token, multi: bool
)

Forward call to relation definition builder.

Source code in src/raesl/compile/typechecking/ast_builder.py
def reldef_add_param(self, name: "Token", type_name: "Token", multi: bool):
    """Forward call to relation definition builder."""
    self.reldef_builder.reldef_add_param(name, type_name, multi)
reldef_param_header
reldef_param_header(header_tok: Token, direction: str)

Forward call to relation definition builder.

Source code in src/raesl/compile/typechecking/ast_builder.py
def reldef_param_header(self, header_tok: "Token", direction: str):
    """Forward call to relation definition builder."""
    self.reldef_builder.reldef_param_header(header_tok, direction)

compdef_behavior_builder

Code for collecting and adding behavior sections to component definitions.

CompDefBehaviorBuilder

CompDefBehaviorBuilder(
    comp_child_builders: CompDefChildBuilders,
)

Class for constructing and checking behavior functions.

Parameters:

Name Type Description Default
comp_child_builders CompDefChildBuilders

Storage of child builders for a component definition.

required

Attributes:

Name Type Description
behavior_kind Optional[str]

Last seen kind of behavior ('requirement' or 'constraint').

expect_conds

Whether the builder should allow conditions to be received from the parser.

expect_results

Whether the builder should allow results to be received from the parser.

pbehaviors List[ParsedBehavior]

Collected behaviors.

Source code in src/raesl/compile/typechecking/compdef_behavior_builder.py
def __init__(self, comp_child_builders: "CompDefChildBuilders"):
    self.diag_store = comp_child_builders.diag_store
    self.comp_child_builders = comp_child_builders

    self.behavior_kind: Optional[str] = None
    self.expect_conds = False
    self.expect_results = False
    self.pbehaviors: List[ParsedBehavior] = []  # Collected behaviors; each holds its list of ParsedCase objects.
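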
behavior_case
behavior_case(case_label_tok: Token)

A new case of the last started behavior functionality was found.

Parameters:

Name Type Description Default
case_label_tok Token

Name of the case.

required
Source code in src/raesl/compile/typechecking/compdef_behavior_builder.py
def behavior_case(self, case_label_tok: "Token"):
    """A new case of the last started behavior functionality was found.

    Arguments:
        case_label_tok: Name of the case.
    """
    parsed_case = ParsedCase(case_label_tok, None, None, [], [])
    self.pbehaviors[-1].cases.append(parsed_case)

    self.expect_conds = False
    self.expect_results = False
behavior_default_when
behavior_default_when(when_tok: Token)

The start of a default condition block was found.

Parameters:

Name Type Description Default
when_tok Token

Position of the start of the new block.

required
Source code in src/raesl/compile/typechecking/compdef_behavior_builder.py
def behavior_default_when(self, when_tok: "Token"):
    """The start of a default condition block was found.

    Arguments:
        when_tok: Position of the start of the new block.
    """
    assert self.pbehaviors[-1].cases[-1].when_tok is None
    self.pbehaviors[-1].cases[-1].when_tok = when_tok

    # Ensure code will crash if you add 'whens'.
    self.pbehaviors[-1].cases[-1].whens = None

    # We don't expect more 'when', and we must see a 'then' first.
    self.expect_conds = False
    self.expect_results = False
behavior_normal_then
behavior_normal_then(then_tok: Token)

The start of a 'then' result block was found.

Parameters:

Name Type Description Default
then_tok Token

Position of the start of the new block.

required
Source code in src/raesl/compile/typechecking/compdef_behavior_builder.py
def behavior_normal_then(self, then_tok: "Token"):
    """The start of a 'then' result block was found.

    Arguments:
        then_tok: Position of the start of the new block.
    """
    assert self.pbehaviors[-1].cases[-1].then_tok is None
    self.pbehaviors[-1].cases[-1].then_tok = then_tok

    # No need to setup case[-1].thens, as behavior_case() already did that.
    self.expect_conds = False
    self.expect_results = True
behavior_normal_when
behavior_normal_when(when_tok: Token)

The start of a normal 'when' condition block was found.

Parameters:

Name Type Description Default
when_tok Token

Position of the start of the new block.

required
Source code in src/raesl/compile/typechecking/compdef_behavior_builder.py
def behavior_normal_when(self, when_tok: "Token"):
    """The start of a normal 'when' condition block was found.

    Arguments:
        when_tok: Position of the start of the new block.
    """
    assert self.pbehaviors[-1].cases[-1].when_tok is None
    self.pbehaviors[-1].cases[-1].when_tok = when_tok

    # No need to setup case[-1].whens, as behavior_case() already did that.
    self.expect_conds = True
    self.expect_results = False
behavior_then_result
behavior_then_result(name_tok: Token, result: Comparison)

A new result was found, add it to the last 'then' block.

Parameters:

Name Type Description Default
name_tok Token

Name of the result.

required
result Comparison

Result to add.

required
Source code in src/raesl/compile/typechecking/compdef_behavior_builder.py
def behavior_then_result(self, name_tok: "Token", result: "Comparison"):
    """A new result was found, add it to the last 'then' block.

    Arguments:
        name_tok: Name of the result.
        result: Result to add.
    """
    assert self.expect_results
    self.pbehaviors[-1].cases[-1].thens.append((name_tok, result))
behavior_when_condition
behavior_when_condition(
    name_tok: Token,
    condition: Union[Disjunction, RelationComparison],
)

A new condition was found, add it to the last 'when' block.

Parameters:

Name Type Description Default
name_tok Token

Name of the condition.

required
condition Union[Disjunction, RelationComparison]

Condition to add.

required
Source code in src/raesl/compile/typechecking/compdef_behavior_builder.py
def behavior_when_condition(
    self, name_tok: "Token", condition: Union["Disjunction", "RelationComparison"]
):
    """A new condition was found, add it to the last 'when' block.

    Arguments:
        name_tok: Name of the condition.
        condition: Condition to add.
    """
    assert self.expect_conds
    assert self.pbehaviors[-1].cases[-1].whens is not None
    self.pbehaviors[-1].cases[-1].whens.append((name_tok, condition))
finish_comp
finish_comp(
    comp_def: ComponentDefinition, _spec: Specification
)

Verify correctness of the collected behavior and store good behavior in :obj:comp_def.

Parameters:

Name Type Description Default
comp_def ComponentDefinition

Surrounding component definition supplying variables and parameters. Checked designs should be added to it after checking.

required
_spec Specification

Specification being constructed, source for types and verbs.

required
Source code in src/raesl/compile/typechecking/compdef_behavior_builder.py
def finish_comp(self, comp_def: "ComponentDefinition", _spec: "Specification"):
    """Verify correctness of the collected behavior and store good behavior in
    :obj:`comp_def`.

    Arguments:
        comp_def: Surrounding component definition supplying variables and
            parameters. Checked designs should be added to it after checking.
        _spec: Specification being constructed, source for types and verbs.
    """
    vps = utils.construct_var_param_map(comp_def)
    expr_checker = ExprChecker(vps, self.diag_store)

    beh_funcs = []
    elements_by_label: Dict[str, List[Any]] = self.comp_child_builders.elements_by_label
    for pbeh in self.pbehaviors:
        # Store labels of functions for duplicate checking.
        elements_by_label[pbeh.name.tok_text].append(pbeh)

        beh_func = components.BehaviorFunction(pbeh.kind, pbeh.name)
        default_cases: List["Token"] = []  # Positions of default cases in this function.
        cases_ordered_by_label: Dict[str, List["Token"]] = defaultdict(list)
        for pcase in pbeh.cases:
            cases_ordered_by_label[pcase.name.tok_text].append(pcase.name)

            # Process conditions.
            if pcase.whens is None:
                # Default case.
                assert pcase.when_tok is not None
                default_cases.append(pcase.when_tok)
                conditions = None
            else:
                # Normal case.
                conditions = self._convert_conditions(pcase.whens, expr_checker)

            # Process results.
            results = self._convert_results(pcase.thens, expr_checker)

            # Add case to the function.
            if conditions is None:
                # Bluntly assume this happens at most once. If not, 'default_cases'
                # will detect it and give an error.
                beh_func.default_results = results
            else:
                beh_case = components.BehaviorCase(pcase.name, conditions, results)
                beh_func.cases.append(beh_case)

        # Verify uniqueness of cases.
        for dup_cases in cases_ordered_by_label.values():
            if len(dup_cases) > 1:
                # Duplicate case names.
                self.diag_store.add(
                    diagnostics.E200(
                        dup_cases[0].tok_text,
                        "behavior case",
                        location=pbeh.name.get_location(),
                        dupes=[dupe.get_location() for dupe in dup_cases],
                    )
                )

        # Check number of default cases.
        if len(default_cases) > 1:
            # Duplicate fallback cases.
            self.diag_store.add(
                diagnostics.E200(
                    pbeh.name.tok_text,
                    "fallback case",
                    location=pbeh.name.get_location(),
                    dupes=[dupe.get_location() for dupe in default_cases],
                )
            )

        beh_funcs.append(beh_func)

    comp_def.behaviors = beh_funcs
new_behavior_function
new_behavior_function(label_tok: Token)

A new behavior functionality was found.

Parameters:

Name Type Description Default
label_tok Token

Name of the new behavior.

required
Source code in src/raesl/compile/typechecking/compdef_behavior_builder.py
def new_behavior_function(self, label_tok: "Token"):
    """A new behavior functionality was found.

    Arguments:
        label_tok: Name of the new behavior.
    """
    assert self.behavior_kind is not None
    parsed_beh = ParsedBehavior(label_tok, self.behavior_kind, [])
    self.pbehaviors.append(parsed_beh)

    self.expect_conds = False
    self.expect_results = False
new_behavior_header
new_behavior_header(kind_tok: Token)

A new 'behavior' section header was found.

Source code in src/raesl/compile/typechecking/compdef_behavior_builder.py
def new_behavior_header(self, kind_tok: "Token"):
    """A new 'behavior' section header was found."""
    if kind_tok.tok_type == "BEHAVIOR_REQUIREMENT_KW":
        self.behavior_kind = components.REQUIREMENT
    else:
        assert kind_tok.tok_type == "BEHAVIOR_CONSTRAINT_KW"
        self.behavior_kind = components.CONSTRAINT

    self.expect_conds = False
    self.expect_results = False
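
The `expect_conds` and `expect_results` flags guard the order in which the parser may call back into the builder: conditions are only accepted after a `when` line, results only after a `then` line. A simplified sketch of that state machine (`MiniBehaviorBuilder` is an illustrative stand-in, not the real `CompDefBehaviorBuilder`):

```python
class MiniBehaviorBuilder:
    """Mimics the flag handling that enforces a sane parser call order."""

    def __init__(self):
        self.expect_conds = False
        self.expect_results = False
        self.cases = []

    def behavior_case(self, label):
        # A new case resets both flags; a 'when' or 'then' must come first.
        self.cases.append({"label": label, "whens": [], "thens": []})
        self.expect_conds = False
        self.expect_results = False

    def behavior_normal_when(self):
        # Conditions may follow now; results may not.
        self.expect_conds = True
        self.expect_results = False

    def behavior_when_condition(self, cond):
        assert self.expect_conds  # only valid inside a 'when' block
        self.cases[-1]["whens"].append(cond)

    def behavior_normal_then(self):
        # Results may follow now; conditions may not.
        self.expect_conds = False
        self.expect_results = True

    def behavior_then_result(self, result):
        assert self.expect_results  # only valid inside a 'then' block
        self.cases[-1]["thens"].append(result)


b = MiniBehaviorBuilder()
b.behavior_case("case-a")
b.behavior_normal_when()
b.behavior_when_condition("is-active")
b.behavior_normal_then()
b.behavior_then_result("output-high")
```

Calling `behavior_then_result` before `behavior_normal_then` would trip the assertion, which is exactly the kind of out-of-order parser call the flags are there to catch.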

ParsedBehavior

ParsedBehavior(
    name: Token, kind: str, cases: List[ParsedCase]
)

Temporary storage of a behavior. This allows catching multiple default cases (which cannot be expressed in the AST) and verifying that calls from the parser arrive in a sane order.

Source code in src/raesl/compile/typechecking/compdef_behavior_builder.py
def __init__(self, name: "Token", kind: str, cases: List["ParsedCase"]):
    self.name = name
    self.kind = kind
    self.cases = cases

ParsedCase

ParsedCase(
    name: Token,
    when_tok: Optional[Token],
    then_tok: Optional[Token],
    whens: WhensType,
    thens: ThensType,
)

Temporary storage for a case of a behavior while its cases are being collected.

Parameters:

Name Type Description Default
name Token

Name of the case.

required
when_tok Optional[Token]

Position of the 'when' line.

required
then_tok Optional[Token]

Position of the 'then' line.

required
whens WhensType

Collected conditions. Starts empty; for the default case it is replaced by None, meaning the variable should not be accessed at all.

required
thens ThensType

Collected results.

required
Source code in src/raesl/compile/typechecking/compdef_behavior_builder.py
def __init__(
    self,
    name: "Token",
    when_tok: Optional["Token"],
    then_tok: Optional["Token"],
    whens: WhensType,
    thens: ThensType,
):
    self.name = name
    self.when_tok = when_tok
    self.then_tok = then_tok
    self.whens: Optional[WhensType] = whens
    self.thens = thens

compdef_builder

Builder for collecting and building component definitions.

CompDefChildBuilders

CompDefChildBuilders(
    compdef_builder: ComponentDefBuilder,
    pos_tok: Token,
    name_tok: Optional[Token],
    varparam_counter: Counter,
)

Class storing child builders for all sections of a component definition.

As type checking cannot be done until the entire specification has been parsed (global types, verbs, and relation definitions may not yet exist when a component definition ends, and definitions of instantiated components may appear further down in the specification), child builders for each definition must be kept around until the end.

Parameters:

Name Type Description Default
compdef_builder ComponentDefBuilder

Parent component definition builder.

required
pos_tok Token

Position of the start of the component definition.

required
name_tok Optional[Token]

Name of the component definition, if it exists.

required
varparam_counter Counter

Object for handing out unique numbers to elementary var/param nodes.

required
Source code in src/raesl/compile/typechecking/compdef_builder.py
def __init__(
    self,
    compdef_builder: "ComponentDefBuilder",
    pos_tok: "Token",
    name_tok: Optional["Token"],
    varparam_counter: Counter,
):
    self.diag_store: diagnostics.DiagnosticStore = compdef_builder.diag_store

    self.pos_tok = pos_tok
    self.name_tok = name_tok
    self.elements_by_label = collections.defaultdict(list)

    # Builders for specific parts of the component.
    self.varparam_builder = CompDefVarParamBuilder(self, varparam_counter)
    self.vargroup_builder = CompDefVarGroupBuilder(self)
    self.compinst_builder = CompDefCompInstBuilder(self)
    self.relinst_builder = CompDefRelInstBuilder(self)
    self.goal_builder = CompDefGoalBuilder(self)
    self.transform_builder = CompDefTransformBuilder(self)
    self.behavior_builder = CompDefBehaviorBuilder(self)
    self.design_builder = CompDefDesignBuilder(self)
    self.need_builder = CompDefNeedBuilder(self)
    self.comment_builder = CompDefCommentBuilder(self)
add_comment
add_comment(name_tok: Token)

Forward call to comment builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def add_comment(self, name_tok: "Token"):
    """Forward call to comment builder."""
    self.comment_builder.add_comment(name_tok)
add_compinst
add_compinst(
    inst_name_tok: Token,
    def_name_tok: Token,
    has_arguments: bool,
)

Forward call to component instance builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def add_compinst(self, inst_name_tok: "Token", def_name_tok: "Token", has_arguments: bool):
    """Forward call to component instance builder."""
    self.compinst_builder.add_compinst(inst_name_tok, def_name_tok, has_arguments)
add_compinst_arguments
add_compinst_arguments(arguments: List[Token])

Forward call to component instance builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def add_compinst_arguments(self, arguments: List["Token"]):
    """Forward call to component instance builder."""
    self.compinst_builder.add_compinst_arguments(arguments)
add_design_subclause
add_design_subclause(sub: SubClause)

Forward call to design builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def add_design_subclause(self, sub: components.SubClause):
    """Forward call to design builder."""
    self.design_builder.add_design_subclause(sub)
add_goal
add_goal(goal: Goal)

Forward call to goal builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def add_goal(self, goal: components.Goal):
    """Forward call to goal builder."""
    self.goal_builder.add_goal(goal)
add_goal_subclause
add_goal_subclause(sub_clause: SubClause)

Forward call to goal builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def add_goal_subclause(self, sub_clause: components.SubClause):
    """Forward call to goal builder."""
    self.goal_builder.add_goal_subclause(sub_clause)
add_need
add_need(
    label_tok: Token, subject_tok: Token, description: str
)

Forward call to need builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def add_need(self, label_tok: "Token", subject_tok: "Token", description: str):
    """Forward call to need builder."""
    self.need_builder.add_need(label_tok, subject_tok, description)
add_parameters
add_parameters(new_params: List[VarParam])

Forward call to variable & parameter builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def add_parameters(self, new_params: List[components.VarParam]):
    """Forward call to variable & parameter builder."""
    self.varparam_builder.add_parameters(new_params)
add_relinst_arguments
add_relinst_arguments(name_toks: List[Token])

Forward call to relation instance builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def add_relinst_arguments(self, name_toks: List["Token"]):
    """Forward call to relation instance builder."""
    self.relinst_builder.add_relinst_arguments(name_toks)
add_transform
add_transform(transform: Transformation)

Forward call to transform builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def add_transform(self, transform: components.Transformation):
    """Forward call to transform builder."""
    self.transform_builder.add_transform(transform)
add_transform_subclause
add_transform_subclause(sub_clause: SubClause)

Forward call to transform builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def add_transform_subclause(self, sub_clause: components.SubClause):
    """Forward call to transform builder."""
    self.transform_builder.add_transform_subclause(sub_clause)
add_variables
add_variables(new_vars: List[VarParam])

Forward call to variable & parameter builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def add_variables(self, new_vars: List[components.VarParam]):
    """Forward call to variable & parameter builder."""
    self.varparam_builder.add_variables(new_vars)
behavior_case
behavior_case(case_label_tok: Token)

Forward call to behavior builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def behavior_case(self, case_label_tok: "Token"):
    """Forward call to behavior builder."""
    self.behavior_builder.behavior_case(case_label_tok)
behavior_default_when
behavior_default_when(when_tok: Token)

Forward call to behavior builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def behavior_default_when(self, when_tok: "Token"):
    """Forward call to behavior builder."""
    self.behavior_builder.behavior_default_when(when_tok)
behavior_normal_then
behavior_normal_then(then_tok: Token)

Forward call to behavior builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def behavior_normal_then(self, then_tok: "Token"):
    """Forward call to behavior builder."""
    self.behavior_builder.behavior_normal_then(then_tok)
behavior_normal_when
behavior_normal_when(when_tok: Token)

Forward call to behavior builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def behavior_normal_when(self, when_tok: "Token"):
    """Forward call to behavior builder."""
    self.behavior_builder.behavior_normal_when(when_tok)
behavior_then_result
behavior_then_result(name_tok: Token, result: Comparison)

Forward call to behavior builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def behavior_then_result(self, name_tok: "Token", result: "exprs.Comparison"):
    """Forward call to behavior builder."""
    self.behavior_builder.behavior_then_result(name_tok, result)
behavior_when_condition
behavior_when_condition(
    name_tok: Token,
    condition: Union[Disjunction, RelationComparison],
)

Forward call to behavior builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def behavior_when_condition(
    self,
    name_tok: "Token",
    condition: Union["exprs.Disjunction", "exprs.RelationComparison"],
):
    """Forward call to behavior builder."""
    self.behavior_builder.behavior_when_condition(name_tok, condition)
design_line
design_line(design: Design)

Forward call to design builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def design_line(self, design: components.Design):
    """Forward call to design builder."""
    self.design_builder.design_line(design)
finish
finish(
    spec: Specification,
    doc_distributor: DocCommentDistributor,
)

Parsing is finished, component child instances have been checked already. Check the collected component data, and add the component definition to the specification.

Parameters:

Name Type Description Default
spec Specification

Specification to use as source for types, verbs, relation definitions, and other component definitions, and to fill with the type-checked component.

required
doc_distributor DocCommentDistributor

Object that accepts the found doc comments for distributing them to the elements of the specification.

required
Source code in src/raesl/compile/typechecking/compdef_builder.py
def finish(self, spec: "Specification", doc_distributor: "DocCommentDistributor"):
    """Parsing is finished, component child instances have been checked already.
    Check the collected component data, and add the component definition to the
    specification.

    Arguments:
        spec: Specification to use as source for types, verbs, relation definitions,
            and other component definitions, and to fill with the type-checked
            component.
        doc_distributor: Object that accepts the found doc comments for distributing
            them to the elements of the specification.
    """
    comp_def = components.ComponentDefinition(self.pos_tok, self.name_tok)
    self.varparam_builder.finish_comp(comp_def, spec)
    self.vargroup_builder.finish_comp(comp_def, spec)
    self.compinst_builder.finish_comp(comp_def, spec)
    self.relinst_builder.finish_comp(comp_def, spec)
    self.goal_builder.finish_comp(comp_def, spec)
    self.transform_builder.finish_comp(comp_def, spec)
    self.behavior_builder.finish_comp(comp_def, spec)
    self.design_builder.finish_comp(comp_def, spec)
    self.need_builder.finish_comp(comp_def, spec)
    self.comment_builder.finish_comp(comp_def, doc_distributor)  # Must be final build step.

    # Check for unique labels within the scope of each comp_def.
    for labeled_elements in self.elements_by_label.values():
        if len(labeled_elements) > 1:
            locs = []
            for elem in labeled_elements:
                if isinstance(elem, (components.VarParam, CollectedVarGroup)):
                    locs.append(elem.name_tok.get_location())
                elif isinstance(elem, ParsedBehavior):
                    locs.append(elem.name.get_location())
                elif isinstance(elem, (RelInst, ComponentInstance)):
                    locs.append(elem.inst_name_tok.get_location())
                else:
                    locs.append(elem.label_tok.get_location())

            if self.name_tok:
                scope = "component definition {}".format(self.name_tok.tok_text)
            else:
                scope = "world"

            if hasattr(labeled_elements[-1], "label_tok"):
                name = labeled_elements[-1].label_tok.tok_text
            elif hasattr(labeled_elements[-1], "name_tok"):
                name = labeled_elements[-1].name_tok.tok_text
            elif hasattr(labeled_elements[-1], "name"):
                name = labeled_elements[-1].name.tok_text
            elif hasattr(labeled_elements[-1], "inst_name_tok"):
                name = labeled_elements[-1].inst_name_tok.tok_text
            else:
                name = ""

            self.diag_store.add(
                diagnostics.E227(
                    name,
                    scope,
                    location=locs[0],
                    dupes=locs,
                )
            )

    # Add all elements of the component to the documentation distributor.
    for elem in components.get_doc_comment_comp_elements(comp_def):
        doc_distributor.add_element(elem)

    if comp_def.name_tok is None:
        spec.world = comp_def
    else:
        spec.comp_defs.append(comp_def)
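
The uniqueness check in `finish` follows a simple pattern: group every named element by its label, then report each label that is claimed more than once. An illustrative sketch (the sample data and the `duplicates` dict are hypothetical; the real code raises `diagnostics.E227` with source locations instead of printing):

```python
from collections import defaultdict

# Collect elements per label, as elements_by_label does in the builder.
elements_by_label = defaultdict(list)
for name, line in [("pump", 3), ("valve", 7), ("pump", 12)]:
    elements_by_label[name].append(line)

# Any label with more than one element is a duplicate within this scope.
duplicates = {
    name: lines
    for name, lines in elements_by_label.items()
    if len(lines) > 1
}
print(duplicates)  # {'pump': [3, 12]}
```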
get_child_compdefnames
get_child_compdefnames() -> Set[str]

Get the names of component definitions that are needed for child instances.

Returns:

Type Description
Set[str]

Names of the components being instantiated in this component.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def get_child_compdefnames(self) -> Set[str]:
    """Get the names of component definitions that are needed for child instances.

    Returns:
        Names of the components being instantiated in this component.
    """
    return self.compinst_builder.get_compdef_names()
new_behavior_function
new_behavior_function(label_tok: Token)

Forward call to behavior builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def new_behavior_function(self, label_tok: "Token"):
    """Forward call to behavior builder."""
    self.behavior_builder.new_behavior_function(label_tok)
new_behavior_header
new_behavior_header(kind_tok: Token)

Forward call to behavior builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def new_behavior_header(self, kind_tok: "Token"):
    """Forward call to behavior builder."""
    self.behavior_builder.new_behavior_header(kind_tok)
new_design_header
new_design_header(kind: Token)

Forward call to design builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def new_design_header(self, kind: "Token"):
    """Forward call to design builder."""
    self.design_builder.new_design_header(kind)
new_goal_header
new_goal_header(goal_kind: Token)

Forward call to goal builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def new_goal_header(self, goal_kind: "Token"):
    """Forward call to goal builder."""
    self.goal_builder.new_goal_header(goal_kind)
new_relinst
new_relinst(inst_name_tok: Token, def_name_tok: Token)

Forward call to relation instance builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def new_relinst(self, inst_name_tok: "Token", def_name_tok: "Token"):
    """Forward call to relation instance builder."""
    self.relinst_builder.new_relinst(inst_name_tok, def_name_tok)
new_transform_header
new_transform_header(transform_kind: Token)

Forward call to transform builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def new_transform_header(self, transform_kind: "Token"):
    """Forward call to transform builder."""
    self.transform_builder.new_transform_header(transform_kind)
new_vargroup
new_vargroup(name_tok: Token)

Forward call to variable group builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def new_vargroup(self, name_tok: "Token"):
    """Forward call to variable group builder."""
    self.vargroup_builder.new_vargroup(name_tok)
notify_parameter_section
notify_parameter_section(pos_tok: Token)

A parameter section was found, check if it is allowed.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def notify_parameter_section(self, pos_tok: "Token"):
    """A parameter section was found, check if it is allowed."""
    if self.name_tok is None:  # We're processing 'world'
        # Parameter section not allowed in 'world'.
        self.diag_store.add(diagnostics.E201("parameter", "world", pos_tok.get_location()))
notify_transform_section
notify_transform_section(pos_tok: Token)

A transform section was found, check if it is allowed.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def notify_transform_section(self, pos_tok: "Token"):
    """A transform section was found, check if it is allowed."""
    if self.name_tok is None:  # We're processing 'world'
        # Transformation section not allowed in 'world'.
        self.diag_store.add(diagnostics.E201("transformation", "world", pos_tok.get_location()))
relinst_argheader
relinst_argheader(argkind_tok: Token)

Forward call to relation instance builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def relinst_argheader(self, argkind_tok: "Token"):
    """Forward call to relation instance builder."""
    self.relinst_builder.relinst_argheader(argkind_tok)
vgroup_add_vars
vgroup_add_vars(varname_toks: List[Token])

Forward call to variable group builder.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def vgroup_add_vars(self, varname_toks: List["Token"]):
    """Forward call to variable group builder."""
    self.vargroup_builder.vgroup_add_vars(varname_toks)

ComponentDefBuilder

ComponentDefBuilder(ast_builder: AstBuilder)

Builder to construct component definitions of the entire specification. The builder keeps a list of component child builders, one for each component definition. The latter do all the work for each component definition.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def __init__(self, ast_builder: "AstBuilder"):
    self.diag_store: diagnostics.DiagnosticStore = ast_builder.diag_store
    self.varparam_counter = Counter(100)

    ast_builder.register_new_section(self)
    self.child_builders: List[CompDefChildBuilders] = []
    self.current_component: Optional[CompDefChildBuilders] = None
finish
finish(
    spec: Specification,
    doc_distributor: DocCommentDistributor,
)

Parsing has finished, complete type checking.

Parameters:

Name Type Description Default
spec Specification

Specification already containing types, verb-prepositions, and relation definitions. Must be filled with component definitions and world component.

required
doc_distributor DocCommentDistributor

Object that accepts the found doc comments for distributing them to the elements of the specification.

required
Source code in src/raesl/compile/typechecking/compdef_builder.py
def finish(self, spec: "Specification", doc_distributor: "DocCommentDistributor"):
    """Parsing has finished, complete type checking.

    Arguments:
        spec: Specification already containing types, verb-prepositions, and
            relation definitions. Must be filled with component definitions and
            world component.
        doc_distributor: Object that accepts the found doc comments for distributing
            them to the elements of the specification.
    """
    # Verify that component definitions are sufficiently unique.
    compdefs: Dict[str, List[CompDefChildBuilders]] = collections.defaultdict(list)
    worlds = []
    for ch_builder in self.child_builders:
        if ch_builder.name_tok is None:
            worlds.append(ch_builder)
        else:
            name = ch_builder.name_tok.tok_text
            compdefs[name].append(ch_builder)

    if len(worlds) == 0:
        self.diag_store.add(diagnostics.E202("world definition"))
    elif len(worlds) > 1:
        locs = [world.pos_tok.get_location() for world in worlds]
        self.diag_store.add(
            diagnostics.E200("world", "component instance", location=locs[0], dupes=locs)
        )

    for cdefs in compdefs.values():
        if len(cdefs) > 1:
            # Duplicate component definitions.
            locs = [cdef.pos_tok.get_location() for cdef in cdefs]
            assert cdefs[0].name_tok is not None
            name = cdefs[0].name_tok.tok_text
            self.diag_store.add(
                diagnostics.E200(name, "component definition", location=locs[0], dupes=locs)
            )

    # Components to use.
    comp_builders = [cdef[0] for cdef in compdefs.values()]
    if len(worlds) > 0:
        comp_builders.append(worlds[0])

    # Order component definitions such that they are defined before they are used as
    # child component instances.
    orderer = Orderer()
    for comp_builder in comp_builders:
        child_defnames = comp_builder.get_child_compdefnames()
        if comp_builder.name_tok:
            compdef_name = comp_builder.name_tok.tok_text
        else:
            compdef_name = "world"  # Name cannot exist otherwise.

        orderer.add_dependency(compdef_name, child_defnames, comp_builder)

    resolveds, cycle = orderer.resolve()
    for entry in resolveds:
        if entry.data is None:
            # Entry was created by the orderer, we should run into it again and fail
            # to find it. Note that a component definition is always added to the
            # spec even if it contains errors. Therefore, presence of a component
            # definition in spec means it was present in the ESL text.
            continue

        entry.data.finish(spec, doc_distributor)

    if cycle:
        assert entry.data is not None
        locs = [entry.data.pos_tok.get_location() for entry in cycle]
        assert cycle[0].data is not None
        comp_name = cycle[0].data.name_tok.tok_text  # Cannot be world!
        self.diag_store.add(
            diagnostics.E204(comp_name, "component definition", location=locs[0], cycle=locs)
        )
new_componentdef
new_componentdef(pos_tok: Token, name_tok: Optional[Token])

New component definition started.

Parameters:

Name Type Description Default
pos_tok Token

Token defining the start position of the component definition.

required
name_tok Optional[Token]

Token with the name of the definition if it exists. Non-existing name means the component represents 'world'.

required
Source code in src/raesl/compile/typechecking/compdef_builder.py
def new_componentdef(self, pos_tok: "Token", name_tok: Optional["Token"]):
    """New component definition started.

    Arguments:
        pos_tok: Token defining the start position of the component definition.
        name_tok: Token with the name of the definition if it exists.
            Non-existing name means the component represents 'world'.
    """
    self.current_component = CompDefChildBuilders(
        self, pos_tok, name_tok, self.varparam_counter
    )
    self.child_builders.append(self.current_component)
notify_new_section
notify_new_section(new_top_section)

Notification for self and possibly the child builders if a new component definition is under construction.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def notify_new_section(self, new_top_section):
    """Notification for self and possibly the child builders if a new component
    definition is under construction.
    """
    if new_top_section:
        self.current_component = None

Counter

Counter(first_free_value: int)

Class for handing out unique numeric values.

Normally, a static class variable would suffice, but testing more than one specification at a time would then not reset the counter, leading to different output depending on which tests run together.

Parameters:

Name Type Description Default
first_free_value int

First free value to set the counter to.

required

Attributes:

Name Type Description
counter

Next free value.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def __init__(self, first_free_value: int):
    self.counter = first_free_value
next
next() -> int

Get a unique number from the counter instance.

Source code in src/raesl/compile/typechecking/compdef_builder.py
def next(self) -> int:
    """Get a unique number from the counter instance."""
    value = self.counter
    self.counter = self.counter + 1
    return value
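Since both methods of `Counter` are shown above, its intended use is easy to demonstrate. The snippet below reproduces the class verbatim and shows why a per-run instance keeps output reproducible; the variable name `varparam_counter` mirrors the attribute referenced in `new_componentdef` above.

```python
class Counter:
    """Hands out unique numeric values (as shown in the source above)."""

    def __init__(self, first_free_value: int):
        self.counter = first_free_value

    def next(self) -> int:
        value = self.counter
        self.counter = self.counter + 1
        return value


# Each compilation run creates a fresh counter, so numbering restarts at the
# same value and output stays identical across test runs.
varparam_counter = Counter(1)
ids = [varparam_counter.next() for _ in range(3)]  # [1, 2, 3]
```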

compdef_comment_builder

Deal with the comment sections in a component definition.

CompDefCommentBuilder

CompDefCommentBuilder(
    comp_child_builders: CompDefChildBuilders,
)

Collect the names in the 'comments' section, and hook them into the doc comments distributor.

Parameters:

Name Type Description Default
comp_child_builders CompDefChildBuilders

Component definition's section builders storage.

required

Attributes:

Name Type Description
diag_store

Diagnostics storage of component definition child builders.

name_toks List[Token]

Names occurring in a comment section, collected during parsing.

Source code in src/raesl/compile/typechecking/compdef_comment_builder.py
def __init__(self, comp_child_builders: "CompDefChildBuilders"):
    self.diag_store = comp_child_builders.diag_store
    self.name_toks: List["Token"] = []
add_comment
add_comment(name_tok: Token)

Parser found a name in a comments section, store it for future processing.

Source code in src/raesl/compile/typechecking/compdef_comment_builder.py
def add_comment(self, name_tok: "Token"):
    """Parser found a name in a comments section, store it for future processing."""
    self.name_toks.append(name_tok)
finish_comp
finish_comp(
    comp_def: ComponentDefinition,
    doc_distributor: DocCommentDistributor,
)

Process all collected names. This method should be the final step in processing a component definition, as it needs all elements that take doc comments.

Parameters:

Name Type Description Default
comp_def ComponentDefinition

Component definition to finish.

required
doc_distributor DocCommentDistributor

Object that distributes doc comments to interested elements of the specification.

required
Source code in src/raesl/compile/typechecking/compdef_comment_builder.py
def finish_comp(
    self, comp_def: "ComponentDefinition", doc_distributor: "DocCommentDistributor"
):
    """Process all collected names. This method should be the final step in
    processing a component definition, as it needs all elements that take doc
    comments.

    Arguments:
        comp_def: Component definition to finish.
        doc_distributor: Object that distributes doc comments to interested elements
            of the specification.
    """
    comp_def_doc_elements = get_doc_comment_comp_elements(comp_def)
    available = {}  # Available elements in the specification, ordered by their name.
    for elm in comp_def_doc_elements:
        assert elm.doc_tok is not None
        available[elm.doc_tok.tok_text] = elm

    for name_tok in self.name_toks:
        # Find language elements to point to by their main name only.
        i = name_tok.tok_text.find(".")
        if i < 0:
            main_name = name_tok.tok_text
        else:
            main_name = name_tok.tok_text[:i]

        opt_elm = available.get(main_name)
        if opt_elm is None:
            # Report an error if an element is not available.
            if comp_def.name_tok is None:
                comp_name = "world"
            else:
                comp_name = comp_def.name_tok.tok_text

            # Can't find doc element in component.
            self.diag_store.add(
                diagnostics.E205(
                    f"element '{name_tok.tok_text}'",
                    f"component '{comp_name}'",
                    name_tok.get_location(),
                )
            )

            # Construct a dummy element so any comment after it is caught.
            # No errors to report as the above already reported one.
            doc_distributor.add_dummy_element(name_tok, False)

        else:
            # Add a proxy, redirecting doc comments to the correct element.
            doc_distributor.add_element(comment_storage.ProxyDocStore(name_tok, opt_elm))

compdef_compinst_builder

Builder to add child component instances to a component definition.

CompDefCompInstBuilder

CompDefCompInstBuilder(
    comp_child_builders: CompDefChildBuilders,
)

Collect and check child component instances of a component definition.

Parameters:

Name Type Description Default
comp_child_builders CompDefChildBuilders

Storage of child builders for a component definition.

required

Attributes:

Name Type Description
diag_store

Child builders problem store.

instances List[ComponentInstance]

Collected component instances.

last_instance Optional[ComponentInstance]

Link to last added instance, to allow adding instance arguments to it.

Source code in src/raesl/compile/typechecking/compdef_compinst_builder.py
def __init__(self, comp_child_builders: "CompDefChildBuilders"):
    self.diag_store = comp_child_builders.diag_store
    self.instances: List[ComponentInstance] = []
    self.comp_child_builders = comp_child_builders
    self.last_instance: Optional[ComponentInstance] = None
add_compinst
add_compinst(
    inst_name_tok: Token,
    def_name_tok: Token,
    has_arguments: bool,
)

Store a new child component instance line.

Source code in src/raesl/compile/typechecking/compdef_compinst_builder.py
def add_compinst(self, inst_name_tok: "Token", def_name_tok: "Token", has_arguments: bool):
    """Store a new child component instance line."""
    compinst = ComponentInstance(inst_name_tok, def_name_tok)
    self.instances.append(compinst)
    if has_arguments:
        self.last_instance = compinst
    else:
        self.last_instance = None
add_compinst_arguments
add_compinst_arguments(arguments: List[Token])

Store a line of component instance argument names.

Source code in src/raesl/compile/typechecking/compdef_compinst_builder.py
def add_compinst_arguments(self, arguments: List["Token"]):
    """Store a line of component instance argument names."""
    assert self.last_instance is not None
    for argname in arguments:
        self.last_instance.arguments.append(InstanceArgument(argname))
finish_comp
finish_comp(
    comp_def: ComponentDefinition, spec: Specification
)

Finish checking and adding child component instances to the component.

Parameters:

Name Type Description Default
comp_def ComponentDefinition

Used as 'my' component definition.

required
spec Specification

Used as source for types.

required
Source code in src/raesl/compile/typechecking/compdef_compinst_builder.py
def finish_comp(self, comp_def: "ComponentDefinition", spec: "Specification"):
    """Finish checking and adding child component instances to the component.

    Arguments:
        comp_def: Used as 'my' component definition.
        spec: Used as source for types.
    """
    # Available names of variables, parameters, and variable-groups in 'my'
    # component.
    avail_vps = utils.construct_var_param_map(comp_def)
    avail_vgroups = utils.construct_vargroup_map(comp_def)

    # The 'other' component definition needed for checking the instance is
    # already available, as ComponentDefBuilder ordered compdef checking on
    # instance use.
    avail_compdefs = dict(
        (cdef.name_tok.tok_text, cdef) for cdef in spec.comp_defs if cdef.name_tok is not None
    )

    elements_by_label: Dict[str, List[Any]] = self.comp_child_builders.elements_by_label
    for inst in self.instances:
        elements_by_label[inst.inst_name_tok.tok_text].append(inst)

    # Collect reported names to avoid duplicate problem reports.
    reported_names: Set[str] = set()

    # Check 'my' instances.
    for inst in self.instances:
        compdef = avail_compdefs.get(inst.def_name_tok.tok_text)
        if compdef is None:
            # Truly undefined 'other' component definition.
            name = inst.def_name_tok.tok_text
            loc = inst.def_name_tok.get_location()
            self.diag_store.add(
                diagnostics.E203("component definition", name=name, location=loc)
            )
            continue  # Cannot do anything else useful with it.

        # Link 'other' component definition to the instance for future use.
        inst.compdef = compdef

        # Found a component definition with the same name, do arguments match as
        # well?
        #
        # 1. Collect argument information of the instance.
        found_error = False
        inst_arguments: List[Optional[Tuple["Token", "Node"]]] = []
        for arg in inst.arguments:
            node: Optional["Node"]
            node = utils.resolve_var_param_group_node(
                arg.name_tok,
                avail_vps,
                avail_vgroups,
                reported_names,
                self.diag_store,
            )

            if node is None:
                # resolve_var_param_group_node() already created an error.

                inst_arguments.append(None)  # Add dummy to check other arguments.
                found_error = True
            else:
                inst_arguments.append((arg.name_tok, node))
                arg.argnode = node

        # Convert parameters of the definition to InputType as well.
        parameters = [(param.name_tok, param.node) for param in compdef.parameters]

        # 2. Check number of parameters against number of arguments.
        inst_length = len(inst_arguments)
        def_length = len(parameters)
        if inst_length != def_length:
            self.diag_store.add(
                diagnostics.E221(
                    "argument",
                    inst_length,
                    def_length,
                    location=inst.inst_name_tok.get_location(),
                    references=[compdef.pos_tok.get_location()],
                )
            )
            found_error = True
            continue

        # 3. Check argument types against parameter types of compdef, skipping None
        #    arguments. We assume undirected flow direction, thus arguments must
        #    accept all possible values that the definition may provide.
        for param_input, arg_input in zip(parameters, inst_arguments):
            if arg_input is None:
                continue  # Already reported as an error.

            diag = check_type(
                subtype=arg_input, supertype=param_input, allow_subtype_limits=False
            )
            if diag:
                self.diag_store.add(diag)
                found_error = True

        # Add inst to compdef if no errors.
        if not found_error:
            comp_def.component_instances.append(inst)
get_compdef_names
get_compdef_names() -> Set[str]

Get the names of used definitions.

Source code in src/raesl/compile/typechecking/compdef_compinst_builder.py
def get_compdef_names(self) -> Set[str]:
    """Get the names of used definitions."""
    def_names = set()
    for inst in self.instances:
        name = inst.def_name_tok.tok_text
        def_names.add(name)

    return def_names

compdef_design_builder

Code for collecting and type checking designs.

CompDefDesignBuilder

CompDefDesignBuilder(
    comp_child_builders: CompDefChildBuilders,
)

Class for collecting and type checking designs in a component definition.

Source code in src/raesl/compile/typechecking/compdef_design_builder.py
def __init__(self, comp_child_builders: "CompDefChildBuilders"):
    self.diag_store = comp_child_builders.diag_store
    self.comp_child_builders = comp_child_builders
    self.designs: List[components.Design] = []
    self.design_kind: Optional[str] = None
add_design_subclause
add_design_subclause(sub: SubClause)

Subclause of the last design has been found, append it to the last design.

Source code in src/raesl/compile/typechecking/compdef_design_builder.py
def add_design_subclause(self, sub: components.SubClause):
    """Subclause of the last design has been found, append it to the last design."""
    self.designs[-1].sub_clauses.append(sub)
design_line
design_line(design: Design)

New design rule found, store it.

Source code in src/raesl/compile/typechecking/compdef_design_builder.py
def design_line(self, design: components.Design):
    """New design rule found, store it."""
    assert self.design_kind is not None
    design.design_kind = self.design_kind
    self.designs.append(design)
finish_comp
finish_comp(
    comp_def: ComponentDefinition, _spec: Specification
)

Check the found designs in the context of 'comp_def', and add them after verification.

Parameters:

Name Type Description Default
comp_def ComponentDefinition

Surrounding component definition supplying variables and parameters. Checked designs should be added to it after checking.

required
_spec Specification

Specification being constructed, source for types and verbs.

required
Source code in src/raesl/compile/typechecking/compdef_design_builder.py
def finish_comp(self, comp_def: components.ComponentDefinition, _spec: "Specification"):
    """Check the found designs in the context of 'comp_def', and add them after
    verification.

    Arguments:
        comp_def: Surrounding component definition supplying variables and
            parameters. Checked designs should be added to it after checking.
        _spec: Specification being constructed, source for types and verbs.
    """
    vps = utils.construct_var_param_map(comp_def)
    expr_checker = ExprChecker(vps, self.diag_store)

    good_designs = []  # Designs that can be added.
    elements_by_label: Dict[str, List[Any]] = self.comp_child_builders.elements_by_label
    for design in self.designs:
        is_good = True

        # Order by label for double use checking.
        elements_by_label[design.label_tok.tok_text].append(design)

        # Verify comparisons.
        if not expr_checker.check_expr(design.expr):
            is_good = False

        for sub in design.sub_clauses:
            if not expr_checker.check_expr(sub.expr):
                is_good = False

        # Consider design good if no big issues found.
        if is_good:
            good_designs.append(design)

    comp_def.designs = good_designs
new_design_header
new_design_header(kind: Token)

New design section started, store the kind stated in the header.

Source code in src/raesl/compile/typechecking/compdef_design_builder.py
def new_design_header(self, kind: "Token"):
    """New design section started, store the kind stated in the header."""
    if kind.tok_type == "DESIGN_REQUIREMENT_KW":
        self.design_kind = components.REQUIREMENT
    else:
        assert kind.tok_type == "DESIGN_CONSTRAINT_KW"
        self.design_kind = components.CONSTRAINT

compdef_goal_builder

Code for collecting and adding goals to component definitions.

CompDefGoalBuilder

CompDefGoalBuilder(
    comp_child_builders: CompDefChildBuilders,
)

Bases: GoalTransformBaseBuilder

Collect goals of a component from the parser, check them, and eventually add them to the surrounding component definition.

Parameters:

Name Type Description Default
comp_child_builders CompDefChildBuilders

Child builders as retrieved from the parser.

required
Source code in src/raesl/compile/typechecking/compdef_goal_builder.py
def __init__(self, comp_child_builders: "CompDefChildBuilders"):
    super(CompDefGoalBuilder, self).__init__(comp_child_builders.diag_store)
    self.goals: List[components.Goal] = []
    self.comp_child_builders = comp_child_builders
    self.goal_kind: Optional[str] = None
add_goal
add_goal(goal: Goal)

New goal has been found by the parser, add it to the found goals.

Parameters:

Name Type Description Default
goal Goal

Goal to add.

required
Source code in src/raesl/compile/typechecking/compdef_goal_builder.py
def add_goal(self, goal: components.Goal):
    """New goal has been found by the parser, add it to the found goals.

    Arguments:
        goal: Goal to add.
    """
    assert self.goal_kind is not None
    goal.goal_kind = self.goal_kind
    self.goals.append(goal)
add_goal_subclause
add_goal_subclause(sub_clause: SubClause)

Subclause of the last goal has been found, add it to the last goal.

Source code in src/raesl/compile/typechecking/compdef_goal_builder.py
def add_goal_subclause(self, sub_clause: components.SubClause):
    """Subclause of the last goal has been found, add it to the last goal."""
    self.goals[-1].sub_clauses.append(sub_clause)
finish_comp
finish_comp(
    comp_def: ComponentDefinition, spec: Specification
)

Check the found goals, and add them to the component.

Source code in src/raesl/compile/typechecking/compdef_goal_builder.py
def finish_comp(self, comp_def: components.ComponentDefinition, spec: "Specification"):
    """Check the found goals, and add them to the component."""
    vps = utils.construct_var_param_map(comp_def)
    cinsts = utils.construct_comp_instances_map(comp_def)
    vpps = utils.construct_verb_prepos_combis(spec)

    expr_checker = ExprChecker(vps, self.diag_store)

    # Verify all goals in the component.
    good_goals = []  # Goals without fatal error.
    elements_by_label: Dict[str, List[Any]] = self.comp_child_builders.elements_by_label
    for goal in self.goals:
        is_good = True
        assert goal.goal_kind is not None
        self.check_form("goal", goal.goal_kind, goal.doesaux, goal.sub_clauses)

        # Check existence of active and passive components.
        goal.active_comp = self.resolve_component(goal.active, cinsts)
        goal.passive_comp = self.resolve_component(goal.passive, cinsts)
        if not goal.active_comp or not goal.passive_comp:
            is_good = False

        # Store goal on its label for duplicate label detection.
        elements_by_label[goal.label_tok.tok_text].append(goal)

        if not self.verify_flows(goal.flows, vps):
            is_good = False

        self.verify_verb_prepos(goal.verb, goal.prepos, vpps)

        # Verify subclauses.
        for sub in goal.sub_clauses:
            if not expr_checker.check_expr(sub.expr):
                is_good = False

        if is_good:
            good_goals.append(goal)

    comp_def.goals = good_goals
new_goal_header
new_goal_header(goal_kind: Token)

New goal header line found.

Parameters:

Name Type Description Default
goal_kind Token

Kind of goals that will follow.

required
Source code in src/raesl/compile/typechecking/compdef_goal_builder.py
def new_goal_header(self, goal_kind: "Token"):
    """New goal header line found.

    Arguments:
        goal_kind: Kind of goals that will follow.
    """
    if goal_kind.tok_type == "GOAL_REQUIREMENT_KW":
        self.goal_kind = components.REQUIREMENT
    else:
        assert goal_kind.tok_type == "GOAL_CONSTRAINT_KW"
        self.goal_kind = components.CONSTRAINT

compdef_need_builder

Code for handling 'needs' in component definitions.

CompDefNeedBuilder

CompDefNeedBuilder(
    comp_child_builders: CompDefChildBuilders,
)

Class for handling 'need' sections.

Parameters:

Name Type Description Default
comp_child_builders CompDefChildBuilders

Storage of child builders for a component definition.

required

Attributes:

Name Type Description
needs List[Need]

Collected needs.

Source code in src/raesl/compile/typechecking/compdef_need_builder.py
def __init__(self, comp_child_builders: "CompDefChildBuilders"):
    self.diag_store = comp_child_builders.diag_store
    self.comp_child_builders = comp_child_builders

    self.needs: List[Need] = []
add_need
add_need(
    label_tok: Token, subject_tok: Token, description: str
)

Parser found another need, store it for future processing.

Source code in src/raesl/compile/typechecking/compdef_need_builder.py
def add_need(self, label_tok: "Token", subject_tok: "Token", description: str):
    """Parser found another need, store it for future processing."""
    need = Need(label_tok, subject_tok, description)
    self.needs.append(need)
finish_comp
finish_comp(
    comp_def: ComponentDefinition, _spec: Specification
)

Check the collected needs, and store them in the component definition.

Source code in src/raesl/compile/typechecking/compdef_need_builder.py
def finish_comp(self, comp_def: "ComponentDefinition", _spec: "Specification"):
    """Check the collected needs, and store them in the component definition."""
    # Verify non-duplicated need labels.
    elements_by_label: Dict[str, List[Any]] = self.comp_child_builders.elements_by_label
    for need in self.needs:
        elements_by_label[need.label_tok.tok_text].append(need)

    # Construct a link to the definition of the subject mentioned in the need.
    # If found, add the need to the component definition, else give an error.
    vps = utils.construct_var_param_map(comp_def)
    cinsts = utils.construct_comp_instances_map(comp_def)
    rgtdbs = utils.construct_relinst_goal_transform_design_behavior_map(comp_def)

    for need in self.needs:
        cinst = cinsts.get(need.subject_tok.tok_text)
        if cinst is not None:
            need.subject = cinst
            comp_def.needs.append(need)
            continue

        rgtdb = rgtdbs.get(need.subject_tok.tok_text)
        if rgtdb is not None:
            # XXX Returned type of rgtdb looks like a subset of allowed subject
            # types. Check!
            need.subject = rgtdb
            comp_def.needs.append(need)
            continue

        varparam = resolve_var_param_node(need.subject_tok, vps, set(), self.diag_store)
        if varparam is not None:
            if isinstance(varparam, ElementaryVarNode):
                need.subject = varparam
                comp_def.needs.append(need)
                continue
            else:
                loc = need.subject_tok.get_location()
                name = need.label_tok.tok_text
                self.diag_store.add(diagnostics.E226(name, location=loc))
                continue

        loc = need.subject_tok.get_location()
        name = need.subject_tok.tok_text
        context = need.label_tok.tok_text
        self.diag_store.add(
            diagnostics.E205(
                f"subject '{name}'",
                f"the context of need '{context}'",
                location=loc,
            )
        )

compdef_relinst_builder

Relation instance type-checking in a component definition.

CompDefRelInstBuilder

CompDefRelInstBuilder(
    comp_child_builders: CompDefChildBuilders,
)

Collect and check relation instances in a component definition.

Parameters:

Name Type Description Default
comp_child_builders CompDefChildBuilders

Storage of child builders for a component definition.

required

Attributes:

Name Type Description
relinsts List[RelInst]

Collected relation instances in the component.

last_relinst Optional[RelInst]

Link to last added instance to allow adding instance arguments.

Source code in src/raesl/compile/typechecking/compdef_relinst_builder.py
def __init__(self, comp_child_builders: "CompDefChildBuilders"):
    self.diag_store = comp_child_builders.diag_store
    self.relinsts: List[RelInst] = []
    self.comp_child_builders = comp_child_builders
    self.last_relinst: Optional[RelInst] = None
add_relinst_arguments
add_relinst_arguments(name_toks: List[Token])

Parser found an argument of a direction-block in a relation instance, store it for future checking.

Source code in src/raesl/compile/typechecking/compdef_relinst_builder.py
def add_relinst_arguments(self, name_toks: List["Token"]):
    """Parser found an argument of a direction-block in a relation instance, store
    it for future checking.
    """
    assert self.last_relinst is not None

    self.last_relinst.arg_blocks[-1].arg_name_toks.extend(name_toks)
finish_comp
finish_comp(
    comp_def: ComponentDefinition, spec: Specification
)

Check the collected relation instances, report errors, and add the instances to the given component.

Parameters:

Name Type Description Default
comp_def ComponentDefinition

Component definition to extend with the found relation instances. Also a source of available variables, parameters, and variable groups.

required
spec Specification

Specification being constructed. Source for types and relation definitions processed previously.

required
Source code in src/raesl/compile/typechecking/compdef_relinst_builder.py
def finish_comp(self, comp_def: "ComponentDefinition", spec: "Specification"):
    """Check the collected relation instances, report errors, and add the
    instances to the given component.

    Arguments:
        comp_def: Component definition to extend with the found relation instances.
            Also a source of available variables, parameters, and variable groups.
        spec: Specification being constructed. Source for types and relation
            definitions processed previously.
    """
    self._add_relation_instance_names()

    # Available names of variables, parameters, and variable-groups in the
    # component.
    avail_vps = utils.construct_var_param_map(comp_def)
    avail_vgroups = utils.construct_vargroup_map(comp_def)

    reported_names: Set[str] = set()  # Names already reported to avoid reporting them again.

    # Make access to relation definitions in the specification easy.
    reldefs = dict((reldef.name.tok_text, reldef) for reldef in spec.rel_defs)

    # Check instances.
    for relinst in self.relinsts:
        # Look for a definition.
        reldef = reldefs.get(relinst.def_name_tok.tok_text)
        if reldef is None:
            # Truly undefined relation definition.
            loc = relinst.def_name_tok.get_location()
            name = relinst.def_name_tok.tok_text
            self.diag_store.add(
                diagnostics.E203("relation definition", name=name, location=loc)
            )
            continue

        def_parameters = CompDefRelInstBuilder._split_definition_parameters_by_kind(reldef)
        instkind_groups: Optional[List[RelInstArgGroup]]
        instkind_groups = self._split_instance_arguments_by_kind(
            relinst, reldef, def_parameters
        )
        if instkind_groups is None:
            continue

        reldef_param_indices = dict((rd_param, i) for i, rd_param in enumerate(reldef.params))
        instance_arguments: List[Optional[List[InstanceArgument]]]
        instance_arguments = [None] * len(
            reldef.params
        )  # Gets filled using 'reldef_param_indices'

        found_error = False
        for group in instkind_groups:
            argkind = group.argkind
            params = group.parameters
            argument_lists = self._group_param_args(
                argkind, params, group.arguments, relinst, reldef
            )
            if argument_lists is None:
                found_error = True
                continue

            # Perform type checking.
            assert len(params) == len(argument_lists)
            for param, args in zip(params, argument_lists):
                relinst_argument = []  # list due to the 'one or more' feature.

                # - For singular-value parameters, args is a list of length 1 and
                #   each value in it must fit in the parameter type.
                # - For plural-value parameters, args may be longer and
                #   each value in it must fit in the parameter type.
                param_input: Tuple[
                    Optional["Token"],
                    Union["BaseType", "VarParam", "VariableGroup"],
                ]
                assert param.type is not None
                param_input = (param.name, param.type)
                for arg in args:
                    # Convert token of the argument to a node.
                    node = utils.resolve_var_param_group_node(
                        arg,
                        avail_vps,
                        avail_vgroups,
                        reported_names,
                        self.diag_store,
                    )
                    if node is None:
                        found_error = True
                        continue

                    # Check type
                    arg_input = (arg, node)
                    type_problem: Optional[diagnostics.EslDiagnostic]
                    type_problem = check_type(
                        subtype=arg_input,
                        supertype=param_input,
                        allow_subtype_limits=(argkind == "requiring"),
                    )
                    if type_problem is not None:
                        self.diag_store.add(type_problem)
                        found_error = True
                        continue

                    relinst_argument.append(InstanceArgument(arg, node))

                instance_arguments[reldef_param_indices[param]] = relinst_argument

        if not found_error:
            assert all(ia is not None for ia in instance_arguments)
            instance = RelationInstance(
                relinst.inst_name_tok,
                relinst.def_name_tok,
                cast(List[List[InstanceArgument]], instance_arguments),
                reldef,
            )
            comp_def.relations.append(instance)
new_relinst
new_relinst(inst_name_tok: Token, def_name_tok: Token)

Parser found a new relation instance; store it for later extension by the parser, eventual type checking, and addition to the surrounding component definition.

Source code in src/raesl/compile/typechecking/compdef_relinst_builder.py
def new_relinst(self, inst_name_tok: "Token", def_name_tok: "Token"):
    """Parser found a new relation instance, store it for future extending by
    the parser, and eventual type checking and adding to the surrounding
    component definition.
    """
    relinst = RelInst(inst_name_tok, def_name_tok)
    self.relinsts.append(relinst)
    self.last_relinst = relinst
relinst_argheader
relinst_argheader(argkind_tok: Token)

Parser found a new direction block for a relation instance, collect it.

Source code in src/raesl/compile/typechecking/compdef_relinst_builder.py
def relinst_argheader(self, argkind_tok: "Token"):
    """Parser found a new direction block for a relation instance, collect it."""
    assert self.last_relinst is not None

    arg_block = RelInstArgsBlock(argkind_tok)
    self.last_relinst.arg_blocks.append(arg_block)

RelInst

RelInst(
    inst_name_tok: Token,
    def_name_tok: Token,
    arg_blocks: Optional[List[RelInstArgsBlock]] = None,
)

Relation instance while collecting.

Parameters:

Name Type Description Default
inst_name_tok Token

Instance name.

required
def_name_tok Token

Definition name.

required
arg_blocks Optional[List[RelInstArgsBlock]]

Blocks with arguments.

None
Source code in src/raesl/compile/typechecking/compdef_relinst_builder.py
def __init__(
    self,
    inst_name_tok: "Token",
    def_name_tok: "Token",
    arg_blocks: Optional[List[RelInstArgsBlock]] = None,
):
    self.inst_name_tok = inst_name_tok
    self.def_name_tok = def_name_tok

    self.arg_blocks: List[RelInstArgsBlock]
    if arg_blocks is None:
        self.arg_blocks = []
    else:
        self.arg_blocks = arg_blocks

RelInstArgGroup

RelInstArgGroup(
    argkind: str,
    parameters: List[RelationDefParameter],
    arguments: List[Token],
)

Temporary data storage of parameters and arguments of a single kind.

Source code in src/raesl/compile/typechecking/compdef_relinst_builder.py
def __init__(
    self,
    argkind: str,
    parameters: List[relations.RelationDefParameter],
    arguments: List["Token"],
):
    self.argkind = argkind
    self.parameters = parameters
    self.arguments = arguments

RelInstArgsBlock

RelInstArgsBlock(
    argkind_tok: Token,
    arg_name_toks: Optional[List[Token]] = None,
)

A 'block' of arguments of a relation instance: the kind of arguments, plus a list of argument lines.

Parameters:

Name Type Description Default
argkind_tok Token

Token indicating the kind of arguments specified in the block.

required
arg_name_toks Optional[List[Token]]

Arguments of the block.

None
Source code in src/raesl/compile/typechecking/compdef_relinst_builder.py
def __init__(self, argkind_tok: "Token", arg_name_toks: Optional[List["Token"]] = None):
    self.argkind_tok = argkind_tok

    self.arg_name_toks: List["Token"]
    if arg_name_toks is None:
        self.arg_name_toks = []
    else:
        self.arg_name_toks = arg_name_toks

compdef_transform_builder

Code for collecting and type checking of transformations.

CompDefTransformBuilder

CompDefTransformBuilder(
    comp_child_builders: CompDefChildBuilders,
)

Bases: GoalTransformBaseBuilder

Collect transformations of a component from the parser, check them, and add them to the component definition.

Parameters:

Name Type Description Default
comp_child_builders CompDefChildBuilders

Storage of child builders for a component definition.

required

Attributes:

Name Type Description
transforms List[Transformation]

Collected transformations.

transform_kind Optional[str]

Kind of the last found transformation, either 'requirement' or 'constraint'.

Source code in src/raesl/compile/typechecking/compdef_transform_builder.py
def __init__(self, comp_child_builders: "CompDefChildBuilders"):
    super(CompDefTransformBuilder, self).__init__(comp_child_builders.diag_store)
    self.transforms: List[components.Transformation] = []
    self.comp_child_builders = comp_child_builders
    self.transform_kind: Optional[str] = None
add_transform
add_transform(transform: Transformation)

A new transformation has been found, add it to the collection.

Parameters:

Name Type Description Default
transform Transformation

Transformation to add.

required
Source code in src/raesl/compile/typechecking/compdef_transform_builder.py
def add_transform(self, transform: components.Transformation):
    """A new transformation has been found, add it to the collection.

    Arguments:
        transform: Transformation to add.
    """
    assert self.transform_kind is not None
    transform.transform_kind = self.transform_kind
    self.transforms.append(transform)
add_transform_subclause
add_transform_subclause(sub_clause: SubClause)

Add a found subclause that belongs to the last transformation.

Source code in src/raesl/compile/typechecking/compdef_transform_builder.py
def add_transform_subclause(self, sub_clause: "SubClause"):
    """Add a found subclause that belongs to the last transformation."""
    self.transforms[-1].sub_clauses.append(sub_clause)
finish_comp
finish_comp(
    comp_def: ComponentDefinition, spec: Specification
)

Check the found transformations, and add them to the component.

Parameters:

Name Type Description Default
comp_def ComponentDefinition

Component definition to extend with the found transformations. Also a source of available variables, parameters, and variable groups.

required
spec Specification

Specification being constructed. Source for types and relation definitions processed previously.

required
Source code in src/raesl/compile/typechecking/compdef_transform_builder.py
def finish_comp(self, comp_def: "ComponentDefinition", spec: "Specification"):
    """Check the found transformations, and add them to the component.

    Arguments:
        comp_def: Component definition to extend with the found transformations.
            Also a source of available variables, parameters, and variable groups.
        spec: Specification being constructed. Source for types and relation
            definitions processed previously.
    """
    vps = utils.construct_var_param_map(comp_def)
    vpps = utils.construct_verb_prepos_combis(spec)

    expr_checker = ExprChecker(vps, self.diag_store)

    # Verify all transformations in the component.
    good_transforms = []  # Transformations to add to the component.
    elements_by_label: Dict[str, List[Any]] = self.comp_child_builders.elements_by_label
    for trans in self.transforms:
        is_good = True
        assert trans.transform_kind is not None
        self.check_form(
            "transformation",
            trans.transform_kind,
            trans.doesaux_tok,
            trans.sub_clauses,
        )

        # Store transformation on its label for duplicate label detection.
        elements_by_label[trans.label_tok.tok_text].append(trans)

        if not self.verify_flows(trans.in_flows, vps):
            is_good = False
        if not self.verify_flows(trans.out_flows, vps):
            is_good = False

        self.verify_verb_prepos(trans.verb_tok, trans.prepos_tok, vpps)

        # Verify subclauses.
        for sub in trans.sub_clauses:
            if not expr_checker.check_expr(sub.expr):
                is_good = False

        if is_good:
            good_transforms.append(trans)

    comp_def.transforms = good_transforms
new_transform_header
new_transform_header(transform_kind: Token)

New transform section line found.

Parameters:

Name Type Description Default
transform_kind Token

Kind of transformations that will follow.

required
Source code in src/raesl/compile/typechecking/compdef_transform_builder.py
def new_transform_header(self, transform_kind: "Token"):
    """New transform section line found.

    Arguments:
        transform_kind: Kind of transformations that will follow.
    """
    if transform_kind.tok_type == "TRANSFORM_REQUIREMENT_KW":
        self.transform_kind = components.REQUIREMENT
    else:
        assert transform_kind.tok_type == "TRANSFORM_CONSTRAINT_KW"
        self.transform_kind = components.CONSTRAINT

compdef_vargroup_builder

Variable groups in a component.

CollectedVarGroup

CollectedVarGroup(
    name_tok: Token, child_name_toks: List[Token]
)

Temporary data storage of found variable groups.

Source code in src/raesl/compile/typechecking/compdef_vargroup_builder.py
def __init__(self, name_tok: "Token", child_name_toks: List["Token"]):
    self.name_tok = name_tok
    self.child_name_toks = child_name_toks

CompDefVarGroupBuilder

CompDefVarGroupBuilder(
    comp_child_builders: CompDefChildBuilders,
)

Collect and check variable groups of a component definition.

Parameters:

Name Type Description Default
comp_child_builders CompDefChildBuilders

Storage of child builders for a component definition.

required

Attributes:

Name Type Description
collected_var_groups List[CollectedVarGroup]

Collected var group instances in the component.

last_vgroup Optional[CollectedVarGroup]

Link to last added instance to allow adding instance arguments.

Source code in src/raesl/compile/typechecking/compdef_vargroup_builder.py
def __init__(self, comp_child_builders: "CompDefChildBuilders"):
    self.diag_store = comp_child_builders.diag_store
    self.collected_var_groups: List[CollectedVarGroup] = []
    self.comp_child_builders = comp_child_builders
    self.last_vgroup: Optional[CollectedVarGroup] = None
finish_comp
finish_comp(
    comp_def: ComponentDefinition, _spec: Specification
)

Check the collected variable groups, report errors, and add the instances to the given component.

Parameters:

Name Type Description Default
comp_def ComponentDefinition

Component definition to extend with the found variable groups. Also a source of available variables and parameters.

required
_spec Specification

Specification being constructed. Source for types and relation definitions processed previously.

required
Source code in src/raesl/compile/typechecking/compdef_vargroup_builder.py
def finish_comp(self, comp_def: "ComponentDefinition", _spec: "Specification"):
    """Check the collected variable groups, report errors, and add the
    instances to the given component.

    Arguments:
        comp_def: Component definition to extend with the found variable groups.
            Also a source of available variables and parameters.
        _spec: Specification being constructed. Source for types and relation
            definitions processed previously.
    """
    self.last_vgroup = None

    avail_varsparams = utils.construct_var_param_map(comp_def)

    # Check for duplicates and conflicts with variables and parameters.
    elements_by_label: Dict[str, List[Any]] = self.comp_child_builders.elements_by_label

    groups_by_label: Dict[str, List[CollectedVarGroup]] = defaultdict(list)

    for cgroup in self.collected_var_groups:
        elements_by_label[cgroup.name_tok.tok_text].append(cgroup)
        groups_by_label[cgroup.name_tok.tok_text].append(cgroup)
    for name, vgrps in groups_by_label.items():
        varparam = avail_varsparams.get(name)
        if varparam is not None:
            vp_loc = varparam.name_tok.get_location()
            vg_locs = [vgrp.name_tok.get_location() for vgrp in vgrps]
            kind = {True: "variable", False: "parameter"}[varparam.is_variable]
            self.diag_store.add(
                diagnostics.E209(name, kind, "variable group", location=vp_loc, others=vg_locs)
            )

    # Build a dependency graph for the variable groups (as groups may include other
    # groups).
    orderer = Orderer()
    for cgroup in self.collected_var_groups:
        # Build a set of required variables, parameters, and variable groups, using
        # non-dotted prefixes.
        needs = set(get_first_namepart(vtok.tok_text) for vtok in cgroup.child_name_toks)
        orderer.add_dependency(cgroup.name_tok.tok_text, needs, cgroup)

    reported: Set[str] = set()  # Reported failures
    vargroups: Dict[str, VariableGroup] = {}  # Resolved variable groups.

    resolved, cycle = orderer.resolve()
    for entry in resolved:
        cgroup = entry.data
        if cgroup is None:
            # Need entry added by the orderer, ignore.
            continue

        # Next collected group to deal with.
        #
        # Each of its variablepart_names either contains non-dotted previous
        # variable groups, or possibly dotted variables or parameters.
        content_vgroup: List[Node] = []  # Result content for the future variable group.
        for partname_tok in cgroup.child_name_toks:
            node = utils.resolve_var_param_group_node(
                partname_tok,
                avail_varsparams,
                vargroups,
                reported,
                self.diag_store,
            )
            if node is not None:
                content_vgroup.append(node)
            # Else, error already reported, ignore it.

        # Even with errors a variable group is constructed to avoid follow-up
        # errors on non-existing groups.
        vgroup = VariableGroup(cgroup.name_tok, cgroup.child_name_toks)
        vgroup.node = GroupNode(cgroup.name_tok, content_vgroup)
        vargroups[cgroup.name_tok.tok_text] = vgroup
        comp_def.var_groups.append(vgroup)

    if cycle:
        locs = [entry.data.name_tok.get_location() for entry in cycle]
        name = cycle[0].data.name_tok.tok_text
        self.diag_store.add(
            diagnostics.E204(name, "variable group", location=locs[0], cycle=locs)
        )
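The resolution performed by `finish_comp` above (groups may include other groups, so included groups must be flattened first, and leftovers indicate a cycle) can be shown in miniature. This is a hedged sketch with a plain dict as input, not the library's `Orderer`-based implementation:

```python
from typing import Dict, List, Set, Tuple

def resolve_groups(groups: Dict[str, List[str]]) -> Tuple[Dict[str, Set[str]], List[str]]:
    """Flatten variable groups that may include other groups.

    Iteratively resolves any group whose included groups are already
    resolved; names left over at the end are part of a dependency cycle
    (or depend on one).
    """
    resolved: Dict[str, Set[str]] = {}
    pending = dict(groups)
    progress = True
    while pending and progress:
        progress = False
        for name, members in list(pending.items()):
            # Members that are themselves groups must be resolved first.
            deps = [m for m in members if m in groups]
            if all(d in resolved for d in deps):
                flat: Set[str] = set()
                for m in members:
                    flat |= resolved.get(m, {m})  # Plain variables map to themselves.
                resolved[name] = flat
                del pending[name]
                progress = True
    return resolved, sorted(pending)
```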
new_vargroup
new_vargroup(name_tok: Token)

Parser found the start of a new variable group definition. Create a new group for it.

Parameters:

Name Type Description Default
name_tok Token

Name of the new variable group.

required
Source code in src/raesl/compile/typechecking/compdef_vargroup_builder.py
def new_vargroup(self, name_tok: "Token"):
    """Parser found the start of a new variable group definition. Create a new group
    for it.

    Arguments:
        name_tok: Name of the new variable group.
    """
    self.last_vgroup = CollectedVarGroup(name_tok, [])
    self.collected_var_groups.append(self.last_vgroup)
vgroup_add_vars
vgroup_add_vars(varpart_name_toks: List[Token])

Parser found a line with variables that are part of the group. Store them for further processing afterwards.

Parameters:

Name Type Description Default
varpart_name_toks List[Token]

Names of variable parts that should be included in the last defined variable group.

required
Source code in src/raesl/compile/typechecking/compdef_vargroup_builder.py
def vgroup_add_vars(self, varpart_name_toks: List["Token"]):
    """Parser found a line with variables that are part of the group. Store them for
    further processing afterwards.

    Arguments:
        varpart_name_toks: Names of variable parts that should be included in
            the last defined variable group.
    """
    assert self.last_vgroup is not None
    self.last_vgroup.child_name_toks.extend(varpart_name_toks)

compdef_varparam_builder

Variables and components in a component definition.

CompDefVarParamBuilder

CompDefVarParamBuilder(
    comp_child_builders: CompDefChildBuilders,
    varparam_counter: Counter,
)

Collect and check variables of a component definition.

Parameters:

Name Type Description Default
comp_child_builders CompDefChildBuilders

Storage of child builders for a component definition.

required
varparam_counter Counter

Object for handing out unique numbers to elementary var/param nodes.

required

Attributes:

Name Type Description
variables List[VarParam]

Variables of the component.

parameters List[VarParam]

Parameters of the component.

Source code in src/raesl/compile/typechecking/compdef_varparam_builder.py
def __init__(self, comp_child_builders: "CompDefChildBuilders", varparam_counter: "Counter"):
    self.diag_store: diagnostics.DiagnosticStore = comp_child_builders.diag_store
    self.varparam_counter = varparam_counter
    self.comp_child_builders = comp_child_builders

    self.variables: List[components.VarParam] = []
    self.parameters: List[components.VarParam] = []
add_parameters
add_parameters(new_params: List[VarParam])

Add parameters of the component definition to the collection.

Source code in src/raesl/compile/typechecking/compdef_varparam_builder.py
def add_parameters(self, new_params: List[components.VarParam]):
    """Add parameters of the component definition to the collection."""
    self.parameters.extend(new_params)
add_variables
add_variables(new_vars: List[VarParam])

Add variables of the component definition to the collection.

Source code in src/raesl/compile/typechecking/compdef_varparam_builder.py
def add_variables(self, new_vars: List[components.VarParam]):
    """Add variables of the component definition to the collection."""
    self.variables.extend(new_vars)
finish_comp
finish_comp(
    comp_def: ComponentDefinition, spec: Specification
)

Check the collected variables and parameters, report errors, and add the instances to the given component.

Parameters:

Name Type Description Default
comp_def ComponentDefinition

Component definition to extend with the found variables and parameters.

required
spec Specification

Specification being constructed. Source for types, verbs and relation definitions processed previously.

required
Source code in src/raesl/compile/typechecking/compdef_varparam_builder.py
def finish_comp(self, comp_def: components.ComponentDefinition, spec: "Specification"):
    """Check the collected variables and parameters, report errors, and add the
    instances to the given component.

    Arguments:
        comp_def: Component definition to extend with the found variables and
            parameters.
        spec: Specification being constructed. Source for types, verbs and relation
            definitions processed previously.
    """
    varsparams_by_name: Dict[str, List[Any]]
    varsparams_by_name = self.comp_child_builders.elements_by_label
    for var in self.variables:
        name = var.name_tok.tok_text
        varsparams_by_name[name].append(var)
    for param in self.parameters:
        name = param.name_tok.tok_text
        varsparams_by_name[name].append(param)

    for name, varsparams in varsparams_by_name.items():
        # Verify properties of the variable or parameter.
        varparam = varsparams[0]
        typename = varparam.type_tok.tok_text
        vartypedef = spec.types.get(typename)
        if vartypedef is None:
            loc = varparam.name_tok.get_location()
            self.diag_store.add(diagnostics.E203("type", name=typename, location=loc))
        else:
            # Build node tree for the variable or parameter.
            varparam.node = _make_varnodes(
                varparam.name_tok, vartypedef.type, self.varparam_counter
            )

            # And store it at the right spot in the component definition.
            if varparam.is_variable:
                comp_def.variables.append(varparam)
            else:
                comp_def.parameters.append(varparam)

expr_checker

Check that expressions comply with the language requirements.

ExprChecker

ExprChecker(
    vps: Dict[str, Union[VarParam]],
    diag_store: DiagnosticStore,
)

Class for checking expressions.

Parameters:

Name Type Description Default
vps Dict[str, Union[VarParam]]

Variables and parameters defined in the expression context.

required
diag_store DiagnosticStore

Storage for found diagnostics.

required

Attributes:

Name Type Description
reported_names Set[str]

Names of variables and variable parts that have already been reported as erroneous, to avoid duplicate error messages.

Source code in src/raesl/compile/typechecking/expr_checker.py
def __init__(
    self,
    vps: Dict[str, Union[components.VarParam]],
    diag_store: diagnostics.DiagnosticStore,
):
    self.vps = vps
    self.diag_store = diag_store
    self.reported_names: Set[str] = set()
check_expr
check_expr(expr: Expression) -> bool

Check whether the expression follows all rules of the ESL language.

Parameters:

Name Type Description Default
expr Expression

Expression to check.

required

Returns:

Type Description
bool

Whether the expression is considered to be sufficiently correct to continue checking other parts of the construct around the expression.

Source code in src/raesl/compile/typechecking/expr_checker.py
def check_expr(self, expr: "Expression") -> bool:
    """Check whether the expression follows all rules of the ESL language.

    Arguments:
        expr: Expression to check.

    Returns:
        Whether the expression is considered to be sufficiently correct to continue
            checking other parts of the construct around the expression.
    """
    is_ok = True
    notdone = [expr]
    while notdone:
        expr = notdone.pop()

        if isinstance(expr, Disjunction):
            # Break the disjunction down to checking its set of child expressions.
            notdone.extend(expr.childs)
            continue

        elif isinstance(expr, RelationComparison):
            if not self.check_relation_comparison(expr):
                is_ok = False

            continue

        elif isinstance(expr, ObjectiveComparison):
            if not self._check_variable(expr.lhs_var):
                is_ok = False
            continue

        assert False, "Unexpected expression '{}' found.".format(repr(expr))

    return is_ok
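The `notdone` worklist in `check_expr` avoids recursion by pushing the children of each disjunction onto a stack and treating everything else as a leaf comparison. The same traversal, sketched with nested tuples standing in for `Disjunction` nodes (purely illustrative, not the raesl expression classes):

```python
def collect_leaves(expr) -> list:
    """Flatten nested disjunctions into their leaf comparisons.

    Mirrors the explicit worklist loop above: disjunction children are
    pushed onto the stack; any other node counts as a leaf.
    """
    leaves = []
    notdone = [expr]
    while notdone:
        node = notdone.pop()
        if isinstance(node, tuple) and node[0] == "or":
            notdone.extend(node[1:])  # Break the disjunction into its children.
        else:
            leaves.append(node)
    return leaves
```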
check_relation_comparison
check_relation_comparison(expr: RelationComparison) -> bool

Check the provided relation comparison.

Parameters:

Name Type Description Default
expr RelationComparison

Relation comparison to check.

required

Returns:

Type Description
bool

Whether the expression is considered to be sufficiently correct.

Source code in src/raesl/compile/typechecking/expr_checker.py
def check_relation_comparison(self, expr: RelationComparison) -> bool:
    """Check the provided relation comparison.

    Arguments:
        expr: Relation comparison to check.

    Returns:
        Whether the expression is considered to be sufficiently correct.
    """
    # Check both sides.
    left_side = self._check_relation_side(expr.lhs_var)
    right_side = self._check_relation_side(expr.rhs_varval)

    if left_side is None or right_side is None:
        return False

    lhs_pos, lhs_type, lhs_units = left_side
    rhs_pos, rhs_type, rhs_units = right_side

    assert lhs_units is None or len(lhs_units) > 0, "lhs var {} has wrong units '{}'".format(
        expr.lhs_var, lhs_units
    )
    assert rhs_units is None or len(rhs_units) > 0, "rhs varval {} has wrong units '{}'".format(
        expr.rhs_varval, rhs_units
    )

    if isinstance(lhs_type, Compound) or isinstance(rhs_type, Compound):
        self.diag_store.add(
            diagnostics.E228(
                lhs_pos.get_location(),
                rhs_pos.get_location(),
                "contains one or more bundles",
            )
        )

    # Check type compatibility.
    if lhs_type and rhs_type:
        # The lhs and rhs must be compatible in left -> right or in
        # right -> left direction. Additional subtype limits isn't a
        # problem, although it may result in a comparison that can
        # never hold.
        lhs_input = (lhs_pos, lhs_type)
        rhs_input = (rhs_pos, rhs_type)
        diagnostic = check_type(
            supertype=lhs_input, subtype=rhs_input, allow_subtype_limits=True
        )
        if diagnostic:
            diagnostic = check_type(
                supertype=rhs_input, subtype=lhs_input, allow_subtype_limits=True
            )

            if diagnostic:
                # A problem in both directions; lhs and rhs cannot be compared.
                self.diag_store.add(
                    diagnostics.E210(
                        lhs_pos.get_location(),
                        rhs_pos.get_location(),
                        "are not compatible",
                    )
                )

    # Both sets empty is ok ([-] vs [-])
    if lhs_units or rhs_units:
        # Or if at least one unit is listed anywhere, it must have a common unit.
        lhs_units = set() if lhs_units is None else lhs_units
        rhs_units = set() if rhs_units is None else rhs_units
        if not lhs_units.intersection(rhs_units):
            lhs_loc = lhs_pos.get_location()
            rhs_loc = rhs_pos.get_location()
            self.diag_store.add(
                diagnostics.E210(lhs_loc, rhs_loc, reason="have no shared unit")
            )

    return True
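The unit rule applied at the end of `check_relation_comparison` is: two dimensionless sides (`[-]` vs `[-]`, i.e. no units on either side) are fine, but as soon as one side lists a unit, the two sides must share at least one. A minimal sketch of that rule (the function name is illustrative, not part of raesl):

```python
from typing import Optional, Set

def units_compatible(lhs: Optional[Set[str]], rhs: Optional[Set[str]]) -> bool:
    """Unit rule from check_relation_comparison: both sides dimensionless
    is OK; otherwise the unit sets must intersect."""
    if not lhs and not rhs:
        return True  # [-] vs [-]
    return bool((lhs or set()) & (rhs or set()))
```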

goal_transform_base

Base class for goal and transformation processing to improve code sharing.

GoalTransformBaseBuilder

GoalTransformBaseBuilder(diag_store: DiagnosticStore)

Common base class for checking goals and transformations.

Parameters:

Name Type Description Default
diag_store DiagnosticStore

Storage for found diagnostics.

required

Attributes:

Name Type Description
reported_names Set[str]

Names of flows with a reported error, to avoid duplicate error generation.

Source code in src/raesl/compile/typechecking/goal_transform_base.py
def __init__(self, diag_store: diagnostics.DiagnosticStore):
    self.diag_store = diag_store
    self.reported_names: Set[str] = set()
check_form
check_form(
    sect_name: str,
    kind: str,
    doesaux: Token,
    sub_clauses: List[SubClause],
)

Check whether the requirement or constraint form of the text is correct with respect to the containing section.

Parameters:

Name Type Description Default
sect_name str

Name of the section (goal or transformation).

required
kind str

Kind of section containing the text (requirement or constraint).

required
doesaux Token

Token in the formulation that is either 'does' or one of the auxiliary verbs.

required
sub_clauses List[SubClause]

Sub clauses belonging to the requirement or constraint.

required
Source code in src/raesl/compile/typechecking/goal_transform_base.py
def check_form(
    self,
    sect_name: str,
    kind: str,
    doesaux: "Token",
    sub_clauses: List["SubClause"],
):
    """Check whether the requirement or constraint form of the text is correct with
    respect to the containing section.

    Arguments:
        sect_name: Name of the section (goal or transformation).
        kind: Kind of section containing the text (requirement or constraint).
        doesaux: Token in the formulation that is either 'does' or one of the
            auxiliary verbs.
        sub_clauses: Sub clauses belonging to the requirement or constraint.
    """
    # None of the possible diagnostics is considered to be fatal, so nothing is
    # returned.

    # Verify uniqueness of subclause labels.
    subclauses_by_label: Dict[str, List["SubClause"]] = defaultdict(list)
    for sub in sub_clauses:
        subclauses_by_label[sub.label_tok.tok_text].append(sub)
    for clauses in subclauses_by_label.values():
        if len(clauses) > 1:
            locs = [sub.label_tok.get_location() for sub in clauses]
            name = clauses[0].label_tok.tok_text
            self.diag_store.add(
                diagnostics.E200(name, "subclause label", location=locs[0], dupes=locs)
            )

    # Check kind-specific requirements.
    if kind == components.CONSTRAINT:
        if doesaux.tok_type != "DOES_KW":
            loc = doesaux.get_location()
            self.diag_store.add(
                diagnostics.E212(
                    "constraint",
                    doesaux.tok_text,
                    "'does'",
                    name=sect_name,
                    location=loc,
                )
            )

        for sub in sub_clauses:
            self._check_constraint_expr(sub.expr)

    else:
        assert kind == components.REQUIREMENT
        if doesaux.tok_type == "DOES_KW":
            loc = doesaux.get_location()
            self.diag_store.add(
                diagnostics.E212(
                    "requirement",
                    doesaux.tok_text,
                    "one of 'must', 'shall', 'should', 'could', or 'won't'",
                    name=sect_name,
                    location=loc,
                )
            )

        for sub in sub_clauses:
            self._check_requirement_expr(sub.expr)
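The duplicate-label check at the top of `check_form` follows a common pattern: bucket elements by label with a `defaultdict`, then report every bucket holding more than one element. In isolation (with integers standing in for the real `Token` locations):

```python
from collections import defaultdict
from typing import Dict, List, Tuple

def find_duplicate_labels(labeled: List[Tuple[str, int]]) -> Dict[str, List[int]]:
    """Group (label, location) pairs by label and keep only the labels
    that occur more than once, as done for subclause labels above."""
    by_label: Dict[str, List[int]] = defaultdict(list)
    for label, location in labeled:
        by_label[label].append(location)
    return {label: locs for label, locs in by_label.items() if len(locs) > 1}
```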
resolve_component
resolve_component(
    compinst_tok: Token,
    cinsts: Dict[str, ComponentInstance],
) -> Optional[ComponentInstance]

Find a component instance with the provided instance name. If it exists, return it, else report an error and return None, indicating failure.

Source code in src/raesl/compile/typechecking/goal_transform_base.py
def resolve_component(
    self, compinst_tok: "Token", cinsts: Dict[str, "ComponentInstance"]
) -> Optional["ComponentInstance"]:
    """Find a component instance with the provided instance name. If it exists,
    return it, else report an error and return None, indicating failure.
    """
    compinst = cinsts.get(compinst_tok.tok_text)
    if compinst is None:
        loc = compinst_tok.get_location()
        self.diag_store.add(
            diagnostics.E203("component instance", name=compinst_tok.tok_text, location=loc)
        )

    return compinst
verify_flows
verify_flows(
    flows: List[Flow], vps: Dict[str, VarParam]
) -> bool

Check that each flow exists as a variable or parameter. Update the link in the Flow object to point to the matching variable or parameter.

Parameters:

Name Type Description Default
flows List[Flow]

Flows to check.

required
vps Dict[str, VarParam]

Available variables and parameters in the component.

required

Returns:

Type Description
bool

Whether all flows can be matched to a variable or parameter.

Source code in src/raesl/compile/typechecking/goal_transform_base.py
def verify_flows(self, flows: List["Flow"], vps: Dict[str, "VarParam"]) -> bool:
    """Check that each flow exists as variable or parameter. Update the link
    in the Flow object to point to the matching variable or parameter.

    Arguments:
        flows: Flows to check.
        vps: Available variables and parameters in the component.

    Returns:
        Whether all flows can be matched to a variable or parameter.
    """
    is_good = True
    for flow in flows:
        node = resolve_var_param_node(flow.name_tok, vps, self.reported_names, self.diag_store)
        if node is None:
            is_good = False
        else:
            flow.flow_node = node

    return is_good
verify_verb_prepos
verify_verb_prepos(
    verb_tok: Token,
    prepos_tok: Token,
    vpps: Set[Tuple[str, str]],
)

Verify the verb and preposition combination and report an error if it does not exist.

Parameters:

Name Type Description Default
verb_tok Token

Token holding the verb text.

required
prepos_tok Token

Token holding the prepos text.

required
vpps Set[Tuple[str, str]]

Available combinations of verbs and prepositions.

required
Source code in src/raesl/compile/typechecking/goal_transform_base.py
def verify_verb_prepos(
    self, verb_tok: "Token", prepos_tok: "Token", vpps: Set[Tuple[str, str]]
):
    """Verify a verb and preposition combination and report an error if the
    combination does not exist.

    Arguments:
        verb_tok: Token holding the verb text.
        prepos_tok: Token holding the prepos text.
        vpps: Available combinations of verbs and prepositions.
    """
    vpp = (verb_tok.tok_text, prepos_tok.tok_text)
    if vpp not in vpps:
        loc = verb_tok.get_location()
        self.diag_store.add(diagnostics.E211(vpp[0], vpp[1], location=loc))
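The check itself is a simple set-membership test on (verb, preposition) pairs. A minimal standalone sketch (the pairs below are made up, not taken from an actual ESL specification):

```python
from typing import Set, Tuple

# Available (verb, preposition) combinations, as they would be
# registered in a specification. Illustrative values only.
vpps: Set[Tuple[str, str]] = {("convert", "from"), ("transport", "to")}


def combination_exists(verb: str, prepos: str) -> bool:
    """Mirror the core of verify_verb_prepos: an unknown pair triggers E211."""
    return (verb, prepos) in vpps


print(combination_exists("convert", "from"))  # True
print(combination_exists("convert", "to"))    # False: would be reported as E211
```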

orderer

Generic class for resolving an order in which to check 'things', assuming each thing has a name and knows the names of its direct dependencies.

Orderer

Orderer()

Class that orders entries so that each entry is processed before the entries in its 'needed_by' list.

Attributes:

Name Type Description
_independent_entries Dict[str, OrdererEntry]

Entries that do not depend on other entries.

_dependent_entries Dict[str, OrdererEntry]

Entries that depend on other entries.

Source code in src/raesl/compile/typechecking/orderer.py
def __init__(self) -> None:
    self._independent_entries: Dict[str, OrdererEntry] = {}
    self._dependent_entries: Dict[str, OrdererEntry] = {}
add_dependency
add_dependency(
    provide: str,
    needs: Iterable[str],
    data: TypeOfData = None,
)

Add a dependency where the 'provide' name depends on the 'needs' names.

Parameters:

Name Type Description Default
provide str

Name of the 'thing' that this dependency provides.

required
needs Iterable[str]

Names of 'things' that are required by the provide 'thing'.

required
data TypeOfData

Optional data associated with 'provide'. At most one such data item should exist; the class cannot handle multiple data items.

None
Source code in src/raesl/compile/typechecking/orderer.py
def add_dependency(self, provide: str, needs: Iterable[str], data: TypeOfData = None):
    """Add a dependency where the 'provide' name depends on the 'needs' names.

    Arguments:
        provide: Name of the 'thing' that this dependency provides.
        needs: Names of 'things' that are required by the provide 'thing'.
        data: Optional data associated with 'provide'. At most one such data item
            should exist; the class cannot handle multiple data items.
    """
    provide_entry = self._find_create_entry(provide, data)
    provide_independent = len(provide_entry.depend_on) == 0

    # Add needs as 'things' it depends on.
    for need in needs:
        need_entry = self._find_create_entry(need)
        need_entry.needed_by.append(provide_entry)
        provide_entry.depend_on.append(need_entry)

    # If 'provide_entry' became dependent, move it.
    if provide_independent and len(provide_entry.depend_on) > 0:
        del self._independent_entries[provide]
        self._dependent_entries[provide] = provide_entry
find_entry
find_entry(name: str) -> Optional[OrdererEntry]

Try to find an entry with the provided name. Useful for duplicate name detection.

Returns:

Type Description
Optional[OrdererEntry]

The found entry (treat as read-only), or None.

Source code in src/raesl/compile/typechecking/orderer.py
def find_entry(self, name: str) -> Optional[OrdererEntry]:
    """Try to find an entry with the provided name. Useful for duplicate name
    detection.

    Returns:
        The found entry (treat as read-only), or None.
    """
    entry = self._independent_entries.get(name)
    if entry is None:
        entry = self._dependent_entries.get(name)
    return entry
resolve
resolve() -> (
    Tuple[List[OrdererEntry], Optional[List[OrdererEntry]]]
)

Resolve the dependency chain by peeling away independent entries. This should cause some dependent entries to become independent, allowing them to be peeled as well.

Returns:

Type Description
Tuple[List[OrdererEntry], Optional[List[OrdererEntry]]]

Ordered sequence of entries containing name and associated data, and optionally a cycle of entries if at least one cycle exists. Note that the Orderer returns one arbitrary cycle in such a case.

Source code in src/raesl/compile/typechecking/orderer.py
def resolve(self) -> Tuple[List[OrdererEntry], Optional[List[OrdererEntry]]]:
    """Resolve the dependency chain by peeling away independent entries. This should
    cause some dependent entries to become independent, allowing them to be peeled
    as well.

    Returns:
        Ordered sequence of entries containing name and associated data, and
            optionally a cycle of entries if at least one cycle exists. Note that
            the Orderer returns one arbitrary cycle in such a case.
    """
    ordered = []
    while len(self._independent_entries) > 0:
        entry = self._independent_entries.popitem()[1]
        assert len(entry.depend_on) == 0
        ordered.append(entry)

        # Peeling done, update dependent entries.
        while len(entry.needed_by) > 0:
            dependent = entry.needed_by.pop()
            dependent.depend_on.remove(entry)
            if len(dependent.depend_on) == 0:
                # Dependent entry became independent!
                del self._dependent_entries[dependent.name]
                self._independent_entries[dependent.name] = dependent

    if len(self._dependent_entries) == 0:
        return ordered, None

    # We have at least one cycle, find one.
    def find_cycle(
        stack: List[OrdererEntry], entry_indices: Dict[OrdererEntry, int]
    ) -> Optional[List[OrdererEntry]]:
        entry = stack[-1]
        for dep in entry.needed_by:
            if dep in entry_indices:
                # Bingo!
                return stack[entry_indices[dep] :]
            entry_indices[dep] = len(stack)
            stack.append(dep)
            cycle = find_cycle(stack, entry_indices)
            if cycle is not None:
                return cycle
            stack.pop()
            del entry_indices[dep]
        return None

    entry = next(iter(self._dependent_entries.values()))
    stack = [entry]
    entry_indices = {entry: 0}
    cycle = find_cycle(stack, entry_indices)
    assert cycle is not None
    return ordered, cycle
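The peeling strategy described above is essentially Kahn's topological sort. A minimal standalone sketch of the same idea, independent of the Orderer API (the function name and signature are illustrative):

```python
from collections import defaultdict
from typing import Dict, List, Optional, Set, Tuple


def order_names(deps: Dict[str, Set[str]]) -> Tuple[List[str], Optional[Set[str]]]:
    """Order names so dependencies come first; also return the names stuck
    on a cycle, if any (None otherwise)."""
    # Remaining unresolved dependencies per name, plus the reverse mapping.
    pending = {name: set(needs) for name, needs in deps.items()}
    needed_by: Dict[str, List[str]] = defaultdict(list)
    for name, needs in deps.items():
        for need in needs:
            pending.setdefault(need, set())
            needed_by[need].append(name)

    independent = [name for name, needs in pending.items() if not needs]
    ordered: List[str] = []
    while independent:
        name = independent.pop()
        ordered.append(name)
        # Peeling done, update dependent entries.
        for dependent in needed_by[name]:
            pending[dependent].discard(name)
            if not pending[dependent]:
                independent.append(dependent)

    leftover = {name for name, needs in pending.items() if needs}
    return ordered, (leftover or None)
```

Unlike Orderer.resolve, this sketch returns all names involved in (or blocked by) cycles rather than tracing one specific cycle.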

OrdererEntry

OrdererEntry(name: str, data: TypeOfData)

Entry in the orderer.

Parameters:

Name Type Description Default
name str

Name of the entry.

required
data TypeOfData

Data associated with the entry.

required

Attributes:

Name Type Description
needed_by List[OrdererEntry]

List of entries that depend on this entry. May only be accessed by the Orderer.

depend_on List[OrdererEntry]

List of entries that this entry depends on. May only be accessed by the Orderer.

Source code in src/raesl/compile/typechecking/orderer.py
def __init__(self, name: str, data: TypeOfData) -> None:
    self.name = name
    self.data = data

    # Internal data.
    self.needed_by: List[OrdererEntry] = []
    self.depend_on: List[OrdererEntry] = []

reldef_builder

Collect and process relation definitions that are found by the parser.

RelationDefBuilder

RelationDefBuilder(ast_builder: AstBuilder)

Builder to construct relation definitions.

Parameters:

Name Type Description Default
ast_builder AstBuilder

AST builder instance.

required

Attributes:

Name Type Description
diag_store

Storage for found diagnostics.

rel_defs Optional[List[RelationDefinition]]

Created relation definitions while collecting data from parsing.

current_reldef Optional[RelationDefinition]

Reference to the entry in 'rel_defs' that is being filled.

last_occurrences Dict[str, Token]

Map of input/output directions to token of last occurrence of that direction in the current definition.

current_direction Optional[str]

Parameter direction to attach to a new parameter.

Source code in src/raesl/compile/typechecking/reldef_builder.py
def __init__(self, ast_builder: "AstBuilder"):
    self.diag_store = ast_builder.diag_store

    self.rel_defs: Optional[List[relations.RelationDefinition]] = []
    self.current_reldef: Optional[relations.RelationDefinition] = None
    self.last_occurrences: Dict[str, "Token"] = {}
    self.current_direction: Optional[str] = None

    ast_builder.register_new_section(self)
add_reldef
add_reldef(name: Token)

Add a new relation definition. Parameters will follow.

Parameters:

Name Type Description Default
name Token

Name of the relation definition.

required
Source code in src/raesl/compile/typechecking/reldef_builder.py
def add_reldef(self, name: "Token"):
    """Add a new relation definition. Parameters will follow.

    Arguments:
        name: Name of the relation definition.
    """
    assert self.rel_defs is not None

    reldef = relations.RelationDefinition(name)
    self.rel_defs.append(reldef)
    self.current_reldef = reldef
    self.last_occurrences = {}
    self.current_direction = None
finish
finish(spec: Specification)

Check the relation definitions and add them to the result specification.

Source code in src/raesl/compile/typechecking/reldef_builder.py
def finish(self, spec: "Specification"):
    """Check the relation definitions and add them to the result specification."""
    reldef_texts: Dict[
        str, "Token"
    ] = {}  # Map of defined names for relation definitions to their token.
    assert self.rel_defs is not None
    for rel_def in self.rel_defs:
        # Verify unique name.
        reldef_text = rel_def.name.tok_text
        if reldef_text in reldef_texts:
            locs = [
                rel_def.name.get_location(),
                reldef_texts[reldef_text].get_location(),
            ]
            self.diag_store.add(
                diagnostics.E200(
                    reldef_text, "relation definition", location=locs[0], dupes=locs
                )
            )
            continue

        reldef_texts[reldef_text] = rel_def.name

        # Verify parameters.
        multi_value_params: Dict[str, List[relations.RelationDefParameter]] = defaultdict(
            list
        )  # Multi-value params in each direction.
        param_texts: Dict[str, "Token"] = {}  # Map of defined parameter names to their token.
        for param in rel_def.params:
            # Register multi-value params.
            if param.multi:
                multi_value_params[param.direction].append(param)

            # Check unique name.
            param_text = param.name.tok_text
            if param_text in param_texts:
                locs = [
                    param.name.get_location(),
                    param_texts[param_text].get_location(),
                ]
                self.diag_store.add(
                    diagnostics.E200(
                        param_text,
                        "parameter definition",
                        location=locs[0],
                        dupes=locs,
                    )
                )
                # Continue anyway

            param_texts[param_text] = param.name

            # check type.
            type_text = param.type_name.tok_text
            typedef = spec.types.get(type_text)
            if typedef is None:
                loc = param.type_name.get_location()
                self.diag_store.add(diagnostics.E203("type", name=type_text, location=loc))

                param.type = None
            else:
                param.type = typedef.type

        # Verify lack of more than one multi-value parameter in each direction.
        found_fatal = False
        for mv_params in multi_value_params.values():
            if len(mv_params) > 1:
                locs = [mvp.name.get_location() for mvp in mv_params]
                direction_text = {
                    INPUT: "requiring",
                    OUTPUT: "returning",
                    INPOUT: "relating",
                }[mv_params[0].direction]
                self.diag_store.add(
                    diagnostics.E213(
                        f"'{direction_text}' multi-value parameter",
                        len(mv_params),
                        "at most 1",
                        location=locs[0],
                        occurrences=locs,
                    )
                )
                found_fatal = True

        if found_fatal:
            continue  # Fatal error.

    spec.rel_defs = self.rel_defs
    self.rel_defs = None  # Avoid adding more relation definitions.
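The duplicate-name checks in finish all follow the same first-occurrence-wins pattern: remember the first token per name and report later occurrences together with it. A condensed sketch of that pattern (list positions stand in for tokens and their locations):

```python
from typing import Dict, List


def find_duplicates(names: List[str]) -> Dict[str, List[int]]:
    """Map each name that occurs more than once to all of its positions."""
    positions: Dict[str, List[int]] = {}
    for i, name in enumerate(names):
        # First occurrence creates the entry; later ones accumulate.
        positions.setdefault(name, []).append(i)
    return {name: idxs for name, idxs in positions.items() if len(idxs) > 1}


print(find_duplicates(["x", "y", "x"]))  # {'x': [0, 2]}
```

In the builder itself, the first and the duplicate location are passed together to diagnostics.E200 so the report can point at both definitions.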
notify_new_section
notify_new_section(_new_top_section: bool)

Parser found a new section, drop all 'in-progress' relation definition construction.

Source code in src/raesl/compile/typechecking/reldef_builder.py
def notify_new_section(self, _new_top_section: bool):
    """Parser found a new section, drop all 'in-progress' relation definition
    construction.
    """
    self.current_reldef = None
    self.last_occurrences = {}
    self.current_direction = None
reldef_add_param
reldef_add_param(
    name: Token, type_name: Token, multi: bool
)

Add a parameter to the current relation definition.

Source code in src/raesl/compile/typechecking/reldef_builder.py
def reldef_add_param(self, name: "Token", type_name: "Token", multi: bool):
    """Add a parameter to the current relation definition."""
    assert self.current_direction is not None
    assert self.current_reldef is not None

    rel_param = relations.RelationDefParameter(name, type_name, self.current_direction, multi)
    self.current_reldef.params.append(rel_param)
reldef_param_header
reldef_param_header(header_tok: Token, direction: str)

New parameter subsection with a direction. Set the direction for the parameters that will follow.

Parameters:

Name Type Description Default
header_tok Token

Token of the direction, for deriving position information if needed.

required
direction str

Direction of the next parameters of the relation definition.

required
Source code in src/raesl/compile/typechecking/reldef_builder.py
def reldef_param_header(self, header_tok: "Token", direction: str):
    """New parameter subsection with a direction. Set the direction for the
    parameters that will follow.

    Arguments:
        header_tok: Token of the direction, for deriving position information if
            needed.
        direction: Direction of the next parameters of the relation definition.
    """
    assert self.current_reldef is not None

    last_occurrence = self.last_occurrences.get(direction)
    if last_occurrence is not None:
        direction_text = {
            INPUT: "require",
            OUTPUT: "returning",
            INPOUT: "relating",
        }[direction]
        locs = [header_tok.get_location(), last_occurrence.get_location()]
        self.diag_store.add(
            diagnostics.E200(direction_text, "parameter section", location=locs[0], dupes=locs)
        )
        # Continue anyway

    self.last_occurrences[direction] = header_tok
    self.current_direction = direction

type_builder

Class for constructing types.

TempBundleDef

TempBundleDef(bundle_name: Token)

Temporary storage of a bundle definition.

Source code in src/raesl/compile/typechecking/type_builder.py
def __init__(self, bundle_name: "Token"):
    self.bundle_name = bundle_name
    self.bundle_fields: List[TempFieldDef] = []

TempFieldDef

TempFieldDef(field_name: Token, type_name: Optional[Token])

Temporary storage of a field in a bundle.

Source code in src/raesl/compile/typechecking/type_builder.py
def __init__(
    self,
    field_name: "Token",
    type_name: Optional["Token"],
):
    self.field_name = field_name
    self.type_name = type_name

TempTypeDef

TempTypeDef(
    type_name: Token,
    parent_name: Optional[Token],
    enum_spec: Optional[List[Value]],
    unit_spec: Optional[List[Token]],
    ival_spec: Optional[
        List[Tuple[Optional[Value], Optional[Value]]]
    ],
    cons_spec: Optional[Value],
)

Temporary storage of a type definition.

Source code in src/raesl/compile/typechecking/type_builder.py
def __init__(
    self,
    type_name: "Token",
    parent_name: Optional["Token"],
    enum_spec: Optional[List[exprs.Value]],
    unit_spec: Optional[List["Token"]],
    ival_spec: Optional[List[Tuple[Optional[exprs.Value], Optional[exprs.Value]]]],
    cons_spec: Optional[exprs.Value],
):
    self.type_name = type_name
    self.parent_name = parent_name
    self.enum_spec = enum_spec
    self.unit_spec = unit_spec
    self.ival_spec = ival_spec
    self.cons_spec = cons_spec

TypeBuilder

TypeBuilder(ast_builder: AstBuilder)

Builder to construct types of the specification.

Parameters:

Name Type Description Default
ast_builder AstBuilder

AST builder instance.

required

Attributes:

Name Type Description
diag_store

Storage for found diagnostics.

type_defs List[TempTypeDef]

Found type definitions in the specification, temporarily stored until all type information is available.

bundle_defs List[TempBundleDef]

Found bundle definitions in the specification, temporarily stored until all type information is available.

current_bundle Optional[TempBundleDef]

If not None, reference to the last opened bundle definition for adding additional fields to it.

Source code in src/raesl/compile/typechecking/type_builder.py
def __init__(self, ast_builder: "AstBuilder"):
    # Make the builder problem store available locally.
    self.diag_store = ast_builder.diag_store

    # Setup type data storage.
    self.type_defs: List[TempTypeDef] = []
    self.bundle_defs: List[TempBundleDef] = []
    self.current_bundle: Optional[TempBundleDef] = None
    self.types_with_error: Optional[Dict[str, Token]] = (
        None  # Typedefs that could not be set up during type construction.
    )

    ast_builder.register_new_section(self)
add_bundle_field
add_bundle_field(
    field_name: Token, type_name: Optional[Token]
)

A new field in the current bundle has been found by the parser, add it.

Source code in src/raesl/compile/typechecking/type_builder.py
def add_bundle_field(
    self,
    field_name: "Token",
    type_name: Optional["Token"],
):
    """A new field in the current bundle has been found by the parser, add it."""
    temp_fdef = TempFieldDef(field_name, type_name)
    assert self.current_bundle, "Trying to add a bundle field outside a type section."
    self.current_bundle.bundle_fields.append(temp_fdef)
add_standard_types staticmethod
add_standard_types(resolved_types: Dict[str, TypeDef])

Add all standard types to the resolved types.

Source code in src/raesl/compile/typechecking/type_builder.py
@staticmethod
def add_standard_types(resolved_types: Dict[str, types.TypeDef]):
    """Add all standard types to the resolved types."""
    for name in STANDARD_TYPE_NAMES:
        name_tok = scanner.Token("NAME", name, "ESL-compiler", 0, 0, 0)
        standard_type = types.ElementaryType(None, [], None)
        typedef = types.TypeDef(name_tok, standard_type)

        assert name not in resolved_types
        resolved_types[name] = typedef
add_typedef
add_typedef(
    type_name: Token,
    parent_name: Optional[Token],
    enum_spec: Optional[List[Value]],
    unit_spec: Optional[List[Token]],
    ival_spec: Optional[
        List[Tuple[Optional[Value], Optional[Value]]]
    ],
    cons_spec: Optional[Value],
)

The parser found a new type definition entry, store it.

Source code in src/raesl/compile/typechecking/type_builder.py
def add_typedef(
    self,
    type_name: "Token",
    parent_name: Optional["Token"],
    enum_spec: Optional[List[exprs.Value]],
    unit_spec: Optional[List["Token"]],
    ival_spec: Optional[List[Tuple[Optional[exprs.Value], Optional[exprs.Value]]]],
    cons_spec: Optional[exprs.Value],
):
    """The parser found a new type definition entry, store it."""
    temp_tdef = TempTypeDef(type_name, parent_name, enum_spec, unit_spec, ival_spec, cons_spec)
    self.type_defs.append(temp_tdef)
    self.current_bundle = None
check_unit
check_unit(
    value: Optional[Value],
    available_units: Dict[str, Token],
) -> None

Check whether the unit possibly specified in 'value' is available for use. Report an error if it is not available.

Source code in src/raesl/compile/typechecking/type_builder.py
def check_unit(self, value: Optional[exprs.Value], available_units: Dict[str, "Token"]) -> None:
    """Check whether the unit possibly specified in 'value' is available for use.
    Report an error if it is not available.
    """
    if value is None or value.unit is None:
        return

    units = value.get_units()
    if units is None:
        return

    if units.intersection(available_units):
        return  # Value uses a known unit.

    unit_text = value.unit.tok_text
    if unit_text.startswith("[") and unit_text.endswith("]"):
        unit_text = unit_text[1:-1]
    self.diag_store.add(diagnostics.E219(unit_text, location=value.unit.get_location()))
finish
finish(spec: Specification)

Check the collected types and bundles, report any errors, and add them to the specification.

Parameters:

Name Type Description Default
spec Specification

Specification to extend with the found types.

required
Source code in src/raesl/compile/typechecking/type_builder.py
def finish(self, spec: Specification):
    """Check the collected types and bundles, report any errors, and add them
    to the specification.

    Arguments:
        spec: Specification to extend with the found types.
    """
    self.current_bundle = None  # Likely not needed, but it's safe.

    orderer = Orderer()

    # Add type definitions.
    for tdef in self.type_defs:
        if tdef.parent_name is None:
            needs = []
        else:
            needs = [tdef.parent_name.tok_text]

        # Check for duplicate use of the name as a type.
        entry = orderer.find_entry(tdef.type_name.tok_text)
        if entry is not None:
            locs = [
                tdef.type_name.get_location(),
                TypeBuilder.get_entry_location(entry),
            ]
            self.diag_store.add(
                diagnostics.E200(
                    tdef.type_name.tok_text,
                    "type definition",
                    location=locs[0],
                    dupes=locs,
                )
            )
            continue

        orderer.add_dependency(tdef.type_name.tok_text, needs, tdef)

    # Add bundle definitions
    for bdef in self.bundle_defs:
        needs = set()
        field_names = {}
        for bfield in bdef.bundle_fields:
            # Verify unique field names.
            field_name = bfield.field_name.tok_text
            if field_name in field_names:
                locs = [
                    field_names[field_name].get_location(),
                    bfield.field_name.get_location(),
                ]
                self.diag_store.add(
                    diagnostics.E200(
                        field_name,
                        "Bundle field name",
                        location=locs[0],
                        dupes=locs,
                    )
                )
                # Continue anyway

            if bfield.type_name is not None:
                needs.add(bfield.type_name.tok_text)

        # Check for duplicate use of the name as a type.
        entry = orderer.find_entry(bdef.bundle_name.tok_text)
        if entry is not None:
            locs = [
                bdef.bundle_name.get_location(),
                TypeBuilder.get_entry_location(entry),
            ]
            # Type name is used for more than one type
            self.diag_store.add(
                diagnostics.E200(
                    bdef.bundle_name.tok_text, "type", location=locs[0], dupes=locs
                )
            )
            continue

        orderer.add_dependency(bdef.bundle_name.tok_text, list(needs), bdef)

    # Let the orderer decide on an order of processing, and process the result.
    resolved, cycle = orderer.resolve()

    self.types_with_error = {}
    """(str -> Token) Types that are defined in the spec but for which something went wrong."""

    resolved_types = {}
    """Dict of name-string to types.TypeDef."""

    # Make standard types generally available for the entire specification.
    TypeBuilder.add_standard_types(resolved_types)

    for entry in resolved:
        if entry.data is None:
            # Entry was created by the orderer, we should run into it again and fail
            # to find it.
            continue

        if isinstance(entry.data, TempTypeDef):
            self._make_typedef(entry.data, resolved_types)
        else:
            assert isinstance(entry.data, TempBundleDef)
            self._make_bundledef(entry.data, resolved_types)

    self.types_with_error = None

    if cycle:
        locs = [TypeBuilder.get_entry_location(entry) for entry in cycle]
        name = cycle[0].name
        self.diag_store.add(
            diagnostics.E204(name, "type definition", location=locs[0], cycle=locs)
        )

    # Store output
    spec.types = resolved_types
get_entry_location staticmethod
get_entry_location(entry: OrdererEntry) -> Location

Retrieve position information from a type entry.

Parameters:

Name Type Description Default
entry OrdererEntry

Entry to use.

required

Returns:

Type Description
Location

Position information of the entry.

Source code in src/raesl/compile/typechecking/type_builder.py
@staticmethod
def get_entry_location(entry: "OrdererEntry") -> "Location":
    """Retrieve position information from a type entry.

    Arguments:
        entry: Entry to use.

    Returns:
        Position information of the entry.
    """
    if isinstance(entry.data, TempTypeDef):
        return entry.data.type_name.get_location()
    else:
        assert isinstance(entry.data, TempBundleDef)
        return entry.data.bundle_name.get_location()
new_bundle_type
new_bundle_type(bundle_name: Token)

The parser found a new bundle in the source code, create it.

Source code in src/raesl/compile/typechecking/type_builder.py
def new_bundle_type(self, bundle_name: "Token"):
    """The parser found a new bundle in the source code, create it."""
    temp_bdef = TempBundleDef(bundle_name)
    self.bundle_defs.append(temp_bdef)
    self.current_bundle = temp_bdef
notify_new_section
notify_new_section(_new_top_section: bool)

Notification that type additions have finished.

Source code in src/raesl/compile/typechecking/type_builder.py
def notify_new_section(self, _new_top_section: bool):
    """Notification that type additions have finished."""
    self.current_bundle = None

type_checker

Type checking.

Type compatibility

Type 'sub_type' is compatible with type 'super_type' if:

  • Type 'sub_type' is (possibly indirectly) derived from 'super_type'.
  • Type 'sub_type' has no additional value constraints in the form of enumerations, upper or lower limits, or constants relative to 'super_type'.

The former condition ensures the values are fundamentally compatible and should always hold. The latter condition ensures that all possible values of 'super_type' can also be expressed in 'sub_type'. This is particularly relevant if 'sub_type' may receive data from the element with 'super_type'. If data flows in the other direction only, the second condition is less relevant.

Note that units are not relevant in this context. If 'sub_type' is a subtype of 'super_type', the former always has all units of the latter.

Relevant code type-classes

Several classes have or use types, or represent data of some type. These classes are

  • raesl.compile.ast.types.ElementaryType (type of a single value).
  • raesl.compile.ast.types.Compound (type of a bundle of values).
  • raesl.compile.ast.nodes.ElementaryVarNode (data of an elementary type).
  • raesl.compile.ast.nodes.CompoundVarNode (data of a bundle).
  • raesl.compile.ast.nodes.GroupNode (data of a variable group).

where an ElementaryVarNode contains an ElementaryType, a CompoundVarNode contains a Compound (type), and GroupNode eventually always points at ElementaryVarNode or CompoundVarNode instances.

The entry point 'check_type' accepts all the above kinds of objects.
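The two compatibility conditions can be sketched with a small standalone model (SimpleType and its fields are illustrative stand-ins, not the actual raesl AST classes):

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class SimpleType:
    name: str
    parent: Optional["SimpleType"] = None
    # Value constraints: enumerations, upper/lower limits, constants.
    constraints: List[str] = field(default_factory=list)


def is_compatible(sub: SimpleType, sup: SimpleType,
                  allow_subtype_limits: bool = False) -> bool:
    # Condition 1: 'sub' must be (possibly indirectly) derived from 'sup'.
    t, extra_constraints = sub, []
    while t is not None and t is not sup:
        extra_constraints += t.constraints
        t = t.parent
    if t is not sup:
        return False
    # Condition 2: no additional value constraints relative to 'sup',
    # unless the caller explicitly allows them.
    return allow_subtype_limits or not extra_constraints


length = SimpleType("length")
small = SimpleType("small-length", parent=length, constraints=["<= 10 [m]"])
print(is_compatible(small, length, allow_subtype_limits=True))  # True
print(is_compatible(small, length))                             # False
```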

TypeData

TypeData(
    name_tok: Token, suffixes: List[str], value: TypeValue
)

Tuple-like class to keep type data from one side together.

Parameters:

Name Type Description Default
name_tok Token

Initial name of the value in the source, and point of use of the type.

required
suffixes List[str]

Child texts after the initial name to indicate the subtype value being examined. See 'get_name' for a description of the suffixes.

required
value TypeValue

Type or node value being examined.

required
Source code in src/raesl/compile/typechecking/type_checker.py
def __init__(self, name_tok: "Token", suffixes: List[str], value: TypeValue):
    self.name_tok = name_tok
    self.suffixes = suffixes
    self.value = value
drop_to_type
drop_to_type()

For nodes, drop down to its type.

Source code in src/raesl/compile/typechecking/type_checker.py
def drop_to_type(self):
    """For nodes, drop down to its type."""
    if isinstance(self.value, ElementaryVarNode):
        return TypeData(self.name_tok, self.suffixes, self.value.the_type)
    elif isinstance(self.value, CompoundVarNode):
        return TypeData(self.name_tok, self.suffixes, self.value.the_type)
    else:
        return self
expand
expand()

Expand a sequence of values in 'self.value' to a sequence of child TypeData instances.

Source code in src/raesl/compile/typechecking/type_checker.py
def expand(self):
    """Expand a sequence of values in 'self.value' to a sequence of child TypeData
    instances.
    """
    if isinstance(self.value, Compound):
        childs = []
        for field in self.value.fields:
            child_suffixes = self.suffixes + [".{}".format(field.name.tok_text)]
            child = TypeData(self.name_tok, child_suffixes, field.type)
            childs.append(child)
        return childs

    else:
        assert isinstance(self.value, GroupNode)
        childs = []
        for i, child_node in enumerate(self.value.child_nodes):
            child_suffixes = self.suffixes + ["[{}]".format(i + 1)]
            child = TypeData(self.name_tok, child_suffixes, child_node)
            childs.append(child)
        return childs
get_name
get_name() -> str

Construct a human-readable name of the point being examined.

It consists of the initial name followed by zero or more suffixes, where a suffix can be one of the following:

  • ".<childname>" for a child field in a bundle, or
  • "[<1-based index>]" for a variable in a variable group.

Returns:

Type Description
str

The constructed name.

Source code in src/raesl/compile/typechecking/type_checker.py
def get_name(self) -> str:
    """Construct a human-readable name of the point being examined.

    It consists of the initial name followed by zero or more suffixes, where
    a suffix can be one of the following
    - ".<childname>" for a child field in a bundle, or
    - "[<1-based index>]" for a variable in a variable group.

    Returns:
        The constructed name.
    """
    return self.name_tok.tok_text + "".join(self.suffixes)
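Illustratively, a hypothetical variable group examined under the name 'flow', whose second member is a bundle with a field 'b', would be reported as follows:

```python
from typing import List


def build_name(base: str, suffixes: List[str]) -> str:
    """Mirror TypeData.get_name: the base name plus accumulated suffixes."""
    return base + "".join(suffixes)


# Second group member, then bundle child field 'b' (illustrative values).
print(build_name("flow", ["[2]", ".b"]))  # flow[2].b
```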

check_type

check_type(
    subtype: Tuple[Token, TypeValue],
    supertype: Tuple[Token, TypeValue],
    allow_subtype_limits: bool = False,
) -> Optional[EslDiagnostic]

Like 'check_type_unpacked', except the subtype and super-type data is packed in tuples.

Source code in src/raesl/compile/typechecking/type_checker.py
def check_type(
    subtype: Tuple["Token", TypeValue],
    supertype: Tuple["Token", TypeValue],
    allow_subtype_limits: bool = False,
) -> Optional[diagnostics.EslDiagnostic]:
    """Like 'check_type_unpacked', except the subtype and super-type data is packed in
    tuples.
    """
    subtok, subval = subtype
    supertok, superval = supertype
    return check_type_unpacked(subtok, subval, supertok, superval, allow_subtype_limits)

check_type_unpacked

check_type_unpacked(
    sub_nametok: Token,
    sub_value: TypeValue,
    super_nametok: Token,
    super_value: TypeValue,
    allow_subtype_limits: bool = False,
) -> Optional[EslDiagnostic]

Check whether sub_value is a subtype of super_value. Iff allow_subtype_limits holds, the sub_value may have additional value constraints. Returns None if the sub_value is indeed a subtype of super_value, possibly taking additional value constraints into account. Otherwise it returns a problem description of how it fails.

Parameters:

Name Type Description Default
sub_nametok Token

Text in the input of the sub_value, usually a variable.

required
sub_value TypeValue

Type or node of the sub value.

required
super_nametok Token

Text in the input of the super_value, usually a variable.

required
super_value TypeValue

Type or node of the super value.

required
allow_subtype_limits bool

Whether sub_value may have additional value constraints relative to super_value.

False

Returns:

Type Description
Optional[EslDiagnostic]

None if sub_value is a subtype of super_value taking allow_subtype_limits into account, else a problem description of how it fails to have the subtype relation.

Source code in src/raesl/compile/typechecking/type_checker.py
def check_type_unpacked(
    sub_nametok: "Token",
    sub_value: TypeValue,
    super_nametok: "Token",
    super_value: TypeValue,
    allow_subtype_limits: bool = False,
) -> Optional[diagnostics.EslDiagnostic]:
    """Check whether sub_value is a subtype of super_value. Iff allow_subtype_limits
    holds, the sub_value may have additional value constraints. Returns None if the
    sub_value is indeed a subtype of super_value, possibly taking additional value
    constraints into account. Otherwise it returns a problem description
    of how it fails.

    Arguments:
        sub_nametok: Text in the input of the sub_value, usually a variable.
        sub_value: Type or node of the sub value.
        super_nametok: Text in the input of the super_value, usually a variable.
        super_value: Type or node of the super value.
        allow_subtype_limits: Whether sub_value may have additional value constraints
            relative to super_value.

    Returns:
        None if sub_value is a subtype of super_value taking allow_subtype_limits
            into account, else a problem description of how it fails to have the
            subtype relation.
    """
    # Do some paranoia checks to ensure checking code does not crash on invalid data.
    assert sub_nametok is not None
    assert super_nametok is not None
    assert isinstance(
        sub_value,
        (ElementaryVarNode, CompoundVarNode, GroupNode, ElementaryType, Compound),
    ), "Weird sub value '{}'".format(sub_value)
    assert isinstance(
        super_value,
        (ElementaryVarNode, CompoundVarNode, GroupNode, ElementaryType, Compound),
    ), "Weird super value '{}'".format(super_value)

    subtype = TypeData(sub_nametok, [], sub_value)
    supertype = TypeData(super_nametok, [], super_value)
    return _check_type(subtype, supertype, allow_subtype_limits)
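A much-simplified sketch of what `allow_subtype_limits` permits: a subtype may narrow its supertype's allowed values, never widen them. The interval representation and function below are illustrative assumptions only; the compiler's actual check walks `TypeValue` structures:

```python
from typing import Optional, Tuple

Interval = Tuple[float, float]  # (lower, upper) bounds; illustrative representation


def check_interval_subtype(
    sub: Interval, sup: Interval, allow_subtype_limits: bool = False
) -> Optional[str]:
    """Return None if 'sub' may be used where 'sup' is expected, else a message."""
    if sub == sup:
        return None  # Identical constraints are always compatible.
    if not allow_subtype_limits:
        return "subtype adds value constraints, which is not allowed here"
    if sub[0] >= sup[0] and sub[1] <= sup[1]:
        return None  # Narrower interval: acceptable additional constraints.
    return "subtype interval {} is not contained in supertype interval {}".format(sub, sup)


print(check_interval_subtype((0, 5), (0, 10), allow_subtype_limits=True))  # None
print(check_interval_subtype((0, 5), (0, 10)))  # error: extra constraints not allowed
```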

utils

Support functions.

construct_comp_instances_map

construct_comp_instances_map(
    comp_def: ComponentDefinition,
) -> Dict[str, ComponentInstance]

Construct a dict of child component instance names of the given component definition.

Parameters:

Name Type Description Default
comp_def ComponentDefinition

Definition to search for available component instances.

required

Returns:

Type Description
Dict[str, ComponentInstance]

Dictionary of component instance names to their instances.

Source code in src/raesl/compile/typechecking/utils.py
def construct_comp_instances_map(
    comp_def: "ComponentDefinition",
) -> Dict[str, "ComponentInstance"]:
    """
    Construct a dict of child component instance names of the given
    component definition.

    :param comp_def: Definition to search for available component instances.
    :return: Dictionary of component instance names to their instances.
    """
    compinsts = {}
    for cinst in comp_def.component_instances:
        compinsts[cinst.inst_name_tok.tok_text] = cinst
    return compinsts
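The loop above is the usual name-indexing pattern; with lightweight stand-ins for the AST objects it is equivalent to a dict comprehension:

```python
from types import SimpleNamespace


# Lightweight stand-ins for component instances and their name tokens.
def make_inst(name):
    return SimpleNamespace(inst_name_tok=SimpleNamespace(tok_text=name))


comp_def = SimpleNamespace(component_instances=[make_inst("pump"), make_inst("valve")])

# Equivalent dict-comprehension form of construct_comp_instances_map:
compinsts = {c.inst_name_tok.tok_text: c for c in comp_def.component_instances}
print(sorted(compinsts))  # ['pump', 'valve']
```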

construct_relinst_goal_transform_design_behavior_map

construct_relinst_goal_transform_design_behavior_map(
    comp_def: ComponentDefinition,
) -> Dict[
    str,
    Union[
        RelationInstance,
        Goal,
        Transformation,
        Design,
        BehaviorFunction,
    ],
]

Construct a dict to quickly find goals, transformations, designs, and behaviors by their label name.

Parameters:

Name Type Description Default
comp_def ComponentDefinition

Definition to search.

required

Returns:

Type Description
Dict[str, Union[RelationInstance, Goal, Transformation, Design, BehaviorFunction]]

Dictionary of labels to their goals, transformations, designs, and behaviors.

Source code in src/raesl/compile/typechecking/utils.py
def construct_relinst_goal_transform_design_behavior_map(
    comp_def: "ComponentDefinition",
) -> Dict[str, Union["RelationInstance", "Goal", "Transformation", "Design", "BehaviorFunction"],]:
    """Construct a dict to quickly find goals, transformations, designs, and behaviors
    by their label name.

    Arguments:
        comp_def: Definition to search.

    Returns:
        Dictionary of labels to their goals, transformations, designs, and behaviors.
    """
    label_map: Dict[
        str,
        Union["RelationInstance", "Goal", "Transformation", "Design", "BehaviorFunction"],
    ]
    label_map = {}
    for relinst in comp_def.relations:
        label_map[relinst.inst_name_tok.tok_text] = relinst
    for goal in comp_def.goals:
        label_map[goal.label_tok.tok_text] = goal
    for trans in comp_def.transforms:
        label_map[trans.label_tok.tok_text] = trans
    for design in comp_def.designs:
        label_map[design.label_tok.tok_text] = design
    for behavior in comp_def.behaviors:
        label_map[behavior.name_tok.tok_text] = behavior
    return label_map
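Note that all five kinds share a single label namespace, so a later loop silently overwrites an earlier entry with the same label. A minimal sketch with stand-in objects (not the compiler's AST classes):

```python
from types import SimpleNamespace


def labeled(kind, label):
    # Stand-in for a labeled AST element (goal, design, ...).
    return SimpleNamespace(kind=kind, label_tok=SimpleNamespace(tok_text=label))


goals = [labeled("goal", "g1")]
designs = [labeled("design", "g1")]  # Same label as the goal above.

label_map = {}
for goal in goals:
    label_map[goal.label_tok.tok_text] = goal
for design in designs:
    label_map[design.label_tok.tok_text] = design  # Overwrites the goal entry.

print(label_map["g1"].kind)  # design
```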

construct_var_param_map

construct_var_param_map(
    comp_def: ComponentDefinition,
) -> Dict[str, VarParam]

Construct a dict of variable / parameter names to their definitions.

Parameters:

Name Type Description Default
comp_def ComponentDefinition

Definition to search for available variables and parameters.

required

Returns:

Type Description
Dict[str, VarParam]

Dictionary of names to their definitions.

Source code in src/raesl/compile/typechecking/utils.py
def construct_var_param_map(comp_def: "ComponentDefinition") -> Dict[str, "VarParam"]:
    """Construct a dict of variable / parameter names to their definitions.

    Arguments:
        comp_def: Definition to search for available variables and parameters.

    Returns:
        Dictionary of names to their definitions.
    """
    vps: Dict[str, "VarParam"] = {}  # Map of name to definition.
    for var in comp_def.variables:
        vps[var.name_tok.tok_text] = var
    for param in comp_def.parameters:
        vps[param.name_tok.tok_text] = param
    return vps

construct_vargroup_map

construct_vargroup_map(
    comp_def: ComponentDefinition,
) -> Dict[str, VariableGroup]

Construct a dict of variable group names to their definitions.

Parameters:

Name Type Description Default
comp_def ComponentDefinition

Definition to search for available variable groups.

required

Returns:

Type Description
Dict[str, VariableGroup]

Dictionary of group names to their definitions.

Source code in src/raesl/compile/typechecking/utils.py
def construct_vargroup_map(
    comp_def: "ComponentDefinition",
) -> Dict[str, "VariableGroup"]:
    """Construct a dict of variable groups names to their definitions.

    Arguments:
        comp_def: Definition to search for available variable groups.

    Returns:
        Dictionary of group names to their definitions.
    """
    return dict((var.name_tok.tok_text, var) for var in comp_def.var_groups)

construct_verb_prepos_combis

construct_verb_prepos_combis(
    spec: Specification,
) -> Set[Tuple[str, str]]

Construct a set with all defined verb/prepos combinations.

Source code in src/raesl/compile/typechecking/utils.py
def construct_verb_prepos_combis(spec: "Specification") -> Set[Tuple[str, str]]:
    """
    Construct a set with all defined verb/prepos combinations.
    """
    return set((vpp.verb.tok_text, vpp.prepos.tok_text) for vpp in spec.verb_prepos)

resolve_var_param_group_node

resolve_var_param_group_node(
    name_tok: Token,
    avail_vps: Optional[Dict[str, VarParam]],
    avail_vgroups: Optional[Dict[str, VariableGroup]],
    reported_names: Set[str],
    diag_store: DiagnosticStore,
) -> Optional[Node]

Resolve the provided (possibly dotted) name to a node from a variable, parameter or variable group.

Parameters:

Name Type Description Default
name_tok Token

Name of the node to obtain, may contain a dotted name.

required
avail_vps Optional[Dict[str, VarParam]]

Available variables and parameters, may be None.

required
avail_vgroups Optional[Dict[str, VariableGroup]]

Available variable groups, may be None.

required
reported_names Set[str]

Non-existing variables, parameters, and groups that are reported already.

required
diag_store DiagnosticStore

Destination for found diagnostics.

required

Returns:

Type Description
Optional[Node]

The node represented by the name. It can be a Node if the name points at a variable group. It is always a VarNode if the name points at a variable or parameter.

Source code in src/raesl/compile/typechecking/utils.py
def resolve_var_param_group_node(
    name_tok: "Token",
    avail_vps: Optional[Dict[str, "VarParam"]],
    avail_vgroups: Optional[Dict[str, "VariableGroup"]],
    reported_names: Set[str],
    diag_store: diagnostics.DiagnosticStore,
) -> Optional["Node"]:
    """Resolve the provided (possibly dotted) name to a node from a variable, parameter
    or variable group.

    Arguments:
        name_tok: Name of the node to obtain, may contain a dotted name.
        avail_vps: Available variables and parameters, may be None.
        avail_vgroups: Available variable groups, may be None.
        reported_names: Non-existing variables, parameters, and groups that are
            reported already.
        diag_store: Destination for found diagnostics.

    Returns:
        The node represented by the name. It can be a Node if the name points at a
            variable group. It is always a VarNode if the name points at a variable
            or parameter.
    """
    first_part = get_first_namepart(name_tok.tok_text)
    if avail_vgroups is not None:
        vgrp = avail_vgroups.get(first_part)
        if vgrp is not None:
            i = name_tok.tok_text.find(".")
            if i >= 0:
                # It is a dotted name, which is not allowed in a group.
                # Report an error if not done already.
                if first_part not in reported_names:
                    reported_names.add(first_part)
                    diag_store.add(
                        diagnostics.E224(
                            "variable group",
                            f"selections like '{name_tok.tok_text[i:]}'",
                            location=name_tok.get_location(len(first_part)),
                        )
                    )
                return None

            return vgrp.node

    # No variable groups, or no match, try variable or parameters.
    if avail_vps is not None:
        varparam = avail_vps.get(first_part)
        if varparam is not None:
            node = varparam.resolve_node(name_tok.tok_text)
            if node is not None:
                return node

            # varparam.resolve_node failed. Do report multiple times for the same
            # var/param for different dotted suffixes.
            if name_tok.tok_text not in reported_names:
                reported_names.add(name_tok.tok_text)

                kind = {True: "variable", False: "parameter"}[varparam.is_variable]
                offset = varparam.get_error_position(name_tok.tok_text)
                # Cannot resolve part of a dotted name.
                diag_store.add(
                    diagnostics.E225(
                        name_tok.tok_text[offset:],
                        first_part,
                        kind,
                        location=name_tok.get_location(offset),
                    )
                )

            return None

    # Name does not exist.
    if first_part not in reported_names:
        reported_names.add(first_part)

        if avail_vps is not None:
            if avail_vgroups is not None:
                kind = "variable, parameter, or variable group instance"
            else:
                kind = "variable or parameter instance"
        else:
            assert avail_vgroups is not None, "Must have at least one set of names."
            kind = "variable group instance"
        diag_store.add(diagnostics.E203(kind, name=first_part, location=name_tok.get_location()))

    return None
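The resolution order above — variable groups first, then variables and parameters, with dotted selection into a group rejected — can be sketched as follows. Plain dicts and strings stand in for the real token, node, and diagnostic machinery:

```python
from typing import Dict, Optional


def resolve_name(
    name: str,
    groups: Optional[Dict[str, str]],
    varparams: Optional[Dict[str, str]],
) -> Optional[str]:
    """Simplified lookup: variable groups take precedence over variables/parameters."""
    first_part = name.split(".", 1)[0]  # Assumed behavior of get_first_namepart.
    if groups is not None and first_part in groups:
        if "." in name:
            return None  # Dotted selection into a group is rejected (cf. E224).
        return groups[first_part]
    if varparams is not None and first_part in varparams:
        return varparams[first_part]  # The real code resolves the dotted suffix here.
    return None  # Unknown name (cf. E203).


print(resolve_name("flows", {"flows": "group-node"}, {"flows": "var-node"}))  # group-node
print(resolve_name("flows.x", {"flows": "group-node"}, None))  # None
```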

resolve_var_param_node

resolve_var_param_node(
    name_tok: Token,
    avail_vps: Dict[str, VarParam],
    reported_names: Set[str],
    diag_store: DiagnosticStore,
) -> Optional[VarNode]

Resolve the (possibly sub)node of a variable or parameter indicated by 'name'. If it fails, report an error if necessary.

Parameters:

Name Type Description Default
name_tok Token

Name of the node to obtain, may contain a dotted name.

required
avail_vps Dict[str, VarParam]

Variables and parameters available in the context.

required
reported_names Set[str]

Non-existing names and prefixes that are reported already.

required
diag_store DiagnosticStore

Storage for found diagnostics.

required

Returns:

Type Description
Optional[VarNode]

The node represented by the name, or None if it could not be found. In the latter case, a problem has been reported indicating the failure to find the node.

Source code in src/raesl/compile/typechecking/utils.py
def resolve_var_param_node(
    name_tok: "Token",
    avail_vps: Dict[str, "VarParam"],
    reported_names: Set[str],
    diag_store: diagnostics.DiagnosticStore,
) -> Optional["VarNode"]:
    """Resolve the (possibly sub)node of a variable or parameter indicated by 'name'.
    If it fails, report an error if necessary.

    Arguments:
        name_tok: Name of the node to obtain, may contain a dotted name.
        avail_vps: Variables and parameters available in the context.
        reported_names: Non-existing names and prefixes that are reported already.
        diag_store: Storage for found diagnostics.

    Returns:
        The node represented by the name, or None if it could not be found.
            In the latter case, a problem has been reported indicating the failure
            to find the node.
    """
    node = resolve_var_param_group_node(name_tok, avail_vps, None, reported_names, diag_store)
    if node is None:
        return None
    assert isinstance(node, VarNode)
    return node

split_arguments

split_arguments(
    params_length: int,
    multiple_index: Optional[int],
    arguments: List[Token],
) -> List[List[Token]]

Given a list of arguments, split them into 'params_length' pieces, where each piece has length 1, except the piece at 'multiple_index' (if not None), which takes all the slack.

Parameters:

Name Type Description Default
params_length int

Number of pieces in the result.

required
multiple_index Optional[int]

Index of the piece that takes all surplus arguments, or None if every piece takes exactly one argument.

required
arguments List[Token]

Actual arguments to split in pieces.

required

Returns:

Type Description
List[List[Token]]

List of 'params_length' argument pieces.
Source code in src/raesl/compile/typechecking/utils.py
def split_arguments(
    params_length: int, multiple_index: Optional[int], arguments: List["Token"]
) -> List[List["Token"]]:
    """Given a list arguments, split them into 'params_length' pieces, where each piece
    has length 1, except piece 'multiple_index' if not None, which takes all the slack.

    Arguments:
        params_length: Number of pieces in the result.
        multiple_index: Index of the piece that takes all surplus arguments, or None
            if every piece takes exactly one argument.
        arguments: Actual arguments to split in pieces.

    Returns:
        List of 'params_length' argument pieces.
    """
    if multiple_index is None:
        assert len(arguments) == params_length
        return [[arg] for arg in arguments]

    assert len(arguments) >= params_length
    assert multiple_index >= 0
    assert multiple_index < params_length
    num_singular = params_length - 1
    length_multi = len(arguments) - num_singular
    after_multiple = multiple_index + length_multi

    return (
        [[arg] for arg in arguments[:multiple_index]]
        + [arguments[multiple_index:after_multiple]]
        + [[arg] for arg in arguments[after_multiple:]]
    )
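The splitting rule is easiest to see with plain strings in place of tokens (same logic as above, illustration only):

```python
from typing import List, Optional


def split_pieces(
    params_length: int, multiple_index: Optional[int], arguments: List[str]
) -> List[List[str]]:
    # Same shape as split_arguments, with strings standing in for tokens.
    if multiple_index is None:
        assert len(arguments) == params_length
        return [[arg] for arg in arguments]
    length_multi = len(arguments) - (params_length - 1)
    after_multiple = multiple_index + length_multi
    return (
        [[arg] for arg in arguments[:multiple_index]]
        + [arguments[multiple_index:after_multiple]]
        + [[arg] for arg in arguments[after_multiple:]]
    )


print(split_pieces(3, 1, ["a", "b", "c", "d"]))  # [['a'], ['b', 'c'], ['d']]
```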

verb_builder

Class to store and check verb / pre-position definitions.

VerbDefBuilder

VerbDefBuilder(ast_builder: AstBuilder)

Part of the builders to deal with verbs / pre-positions.

Source code in src/raesl/compile/typechecking/verb_builder.py
def __init__(self, ast_builder: "AstBuilder"):
    # Make the builder problem store available locally.
    self.diag_store = ast_builder.diag_store

    # Setup local storage
    self.storage: Dict[Tuple[str, str], List[verbs.VerbPreposDef]]
    self.storage = collections.defaultdict(list)
add_verbdef
add_verbdef(verb_tok: Token, prepos_tok: Token)

Store the provided verb/prepos combination.

Source code in src/raesl/compile/typechecking/verb_builder.py
def add_verbdef(self, verb_tok: "Token", prepos_tok: "Token"):
    """Store the provided verb/prepos combination."""

    # For error reporting, order storage by the verb and prepos text.
    key = (verb_tok.tok_text.lower(), prepos_tok.tok_text.lower())
    vdef = verbs.VerbPreposDef(verb_tok, prepos_tok)
    self.storage[key].append(vdef)
finish
finish(spec: Specification)

Finish collecting by checking the collected verb-prepositions. Store result in the provided specification.

Source code in src/raesl/compile/typechecking/verb_builder.py
def finish(self, spec: "Specification"):
    """Finish collecting by checking the collected verb-prepositions. Store result
    in the provided specification.
    """
    for verb_prepos_text, verbdefs in self.storage.items():
        if len(verbdefs) != 1:
            # More than one definition of the same verb/prepos combination.
            dupes = [vd.verb.get_location() for vd in verbdefs]
            self.diag_store.add(
                diagnostics.W200(
                    " ".join(verb_prepos_text),
                    "verb-preposition combination",
                    location=dupes[0],
                    dupes=dupes,
                )
            )

    # spec.verb_prepos = [verbdefs[0] for verbdefs in self.storage.values()]
    # mypy generates false positive on the list comprehension
    spec.verb_prepos = []
    for verbdefs in self.storage.values():
        spec.verb_prepos.append(verbdefs[0])
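The duplicate detection in `finish` follows a common pattern: group definitions under a normalized key, then flag keys that collected more than one entry. A standalone sketch:

```python
import collections

# Group verb/preposition definitions under a case-normalized key, as the builder does.
storage = collections.defaultdict(list)
for verb, prepos in [("transport", "to"), ("Transport", "to"), ("convert", "into")]:
    storage[(verb.lower(), prepos.lower())].append((verb, prepos))

# Any key with more than one entry is a duplicate definition (reported as W200).
duplicates = {key: defs for key, defs in storage.items() if len(defs) > 1}
print(sorted(duplicates))  # [('transport', 'to')]
```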