The NFDCG Approach: Systematic Management of Non-Functional Properties

NFDCG stands for Non-Functional Dependency Composition Graph and is a meta-approach for composing non-functional properties (NFPs) and deciding on adequate execution alternatives at the building block level. Composing a NFP of a building block has a rather abstract meaning and is defined as determining a representative value for that NFP. This value determination is not always trivial because NFPs often have many variability dependencies. Variability regarding a NFP of a building block may arise for the following reasons:

  • A building block contains a set of configuration parameters that offer certain variability ranges for their value bindings
  • A building block is functionally composed of other building blocks and there is variability in the form of functionally equivalent building blocks
  • The NFPs of a building block depend on specific context entities that have certain variability ranges for their possible states/value bindings

The first two points are forms of active variability from which a robot can choose. The last point is a form of passive variability and therefore describes conditions under which a robot may have to operate. If we compose a NFP of a building block based on its variability dependencies, we can look up which value we can achieve for every combination of active and passive variability. When we apply this procedure to every NFP of every building block in our system, we can theoretically represent the whole non-functional problem space of the system.
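The combinatorics behind this problem space can be sketched in a few lines of Python; all values below are invented for illustration and only show how active and passive variability span the space:

```python
from itertools import product

# Hypothetical, simplified example: one configuration parameter and one
# set of functionally equivalent alternatives (active variability), plus
# one context dimension (passive variability).
velocity_mm_s = [100, 300, 500]        # discretized parameter range
sensor_variant = ["laser", "camera"]   # functionally equivalent alternatives
daytime = ["day", "night"]             # context states

# Every combination is one point of the non-functional problem space;
# for each point, the composed NFP values can be looked up.
problem_space = list(product(velocity_mm_s, sensor_variant, daytime))
print(len(problem_space))  # 3 * 2 * 2 = 12 combinations
```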

Meta Level of the NFDCG Approach

The PDO Principle

From a mathematical viewpoint, the whole problem space can be represented as a graph connecting different local graph patterns according to the following principle: An output node depends on other input nodes and a special composition operator maps the variability ranges of the input nodes to the variability range of the output node. The nodes of the graph are special value providers for any relevant entity impacting the composition of a NFP or the overall decision-making. The meta model defines this principle based on a triple relation between three abstract basic meta-types: provider, dependency and operator.
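As a rough illustration of this triple relation (all names below are invented for this sketch and not taken from the actual meta-model implementation), a provider can be viewed as a node whose operator maps the resolved values of its dependencies to its own value:

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class Provider:
    # Illustrative node of the graph: a provider either carries a leaf
    # value or derives one by applying its operator to its dependencies.
    name: str
    dependencies: List["Provider"] = field(default_factory=list)
    operator: Optional[Callable[[List[float]], float]] = None
    value: Optional[float] = None

    def resolve(self) -> float:
        if not self.dependencies:
            return self.value
        return self.operator([d.resolve() for d in self.dependencies])

velocity = Provider("Velocity", value=500.0)                  # input node
time = Provider("Time", dependencies=[velocity],
                operator=lambda xs: 1000.0 / xs[0])           # output node
print(time.resolve())  # 2.0
```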

There are several subtypes of a provider and an operator. A provider subtype has special dependency constraints to other provider subtypes. A valid provider-dependency subtype combination has specific assignment constraints of operator subtypes. The current set of subtypes, their relations and constraints are shown in the following meta-model:


  • NFP: A NFP entity provides any kind of non-functional information at a certain abstraction level and in a specific representation, and can have different dependencies.
  • Parameter: A parameter is a variation point - a source of active variability that is indirectly reflected in the modeled NFP entities that depend on it. It represents a continuous numerical value with a specific, constrainable range and a definable discretization size.
  • Conflict: A conflict entity represents a conflict of objectives between the NFPs modeled as dependencies of the conflict.
  • Constraint Specification (CSpec): This entity is the provider of a non-functional requirement specification in the form of a constraint. It can be formulated for a NFP that is modeled in an absolute representation. Requirements are not always constant values, but can have dependencies as well.
  • Weight Specification (WSpec): This entity is the provider of weights. It is either linked to the dependencies of a conflict (preferences) or to a relative composition between NFPs (relative impact factor).


  • Relative Mapping (RM): Relative Mapping represents a function f() that maps a set of dependencies to a normalized value of the NFP under consideration. It can be used to describe a relative utility function if not all dependencies can be taken into account and/or the absolute composition function is unknown.
  • Absolute Composition (AC): The absolute composition operator composes the modeled dependencies to an absolute value in the respective unit of the provider under consideration. It achieves the most useful representation, but the function f() can become arbitrarily complex depending on the provider type and the quality of the function. It may be realized by a closed-form expression using a set of well-known operators and functions, or it may require more complex analysis methods. The operator can have a precondition p that simplifies the compositional complexity by shifting dependencies from f() to p. The composition is consistent when p is fulfilled; otherwise the outcome is unclear.
  • Weighted Sum (WS): This operator calculates the weighted sum of the dependencies.
  • Constraint (C): The constraint operator restricts the allowed scope of a provider entity to a given range.
  • Collect-Aggregate (CA): The CA operator collects resulting values from a discrete set of competing, functionally equivalent building block alternatives and aggregates them.
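A minimal sketch of what these operator subtypes compute (illustrative Python, not the actual NFDCG implementation):

```python
# WS: weighted sum of dependency values
def weighted_sum(values, weights):
    return sum(w * v for w, v in zip(weights, values))

# C: check whether a provider value lies within its allowed scope
def constraint_satisfied(value, lower, upper):
    return lower <= value <= upper

# CA: collect the results of competing, functionally equivalent
# alternatives into a traceable variability vector
def collect_aggregate(results_per_alternative):
    return list(results_per_alternative)

print(weighted_sum([10, 20], [0.5, 0.5]))     # 15.0
print(constraint_satisfied(450, 0, 600))      # True
print(collect_aggregate([30, 80]))            # [30, 80]
```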


We use the well-known Block-Port-Connector modeling scheme to support the separation of roles and responsibilities:

  • Each non-functional building block provides relevant non-functional information of a self-contained responsibility via its Output ports.
  • A non-functional building block is a collection and composition of different providers. Input ports allow defining an external provider dependency; the concrete values are then provided by the connected Output port of another building block.
  • A Fork variation point is a special kind of building block that manages the combinatorial presentation of variants and results.
  • Composability of concrete Provider type instances (e.g. Time as a NFP instance) can be achieved by referencing prespecified types from a repository defining aspects such as datatype and unit (not shown here).
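The port mechanism can be mimicked with a tiny sketch (names and values are illustrative only, assuming the laser accuracy from the example below):

```python
# A block exposes providers via output ports; input ports are bound to
# the output ports of other blocks by a connector.
class Block:
    def __init__(self, name):
        self.name = name
        self.outputs = {}   # port name -> value provider (callable)
        self.inputs = {}    # port name -> connected output provider

def connect(src_block, out_port, dst_block, in_port):
    dst_block.inputs[in_port] = src_block.outputs[out_port]

laser = Block("Laser")
laser.outputs["Accuracy"] = lambda: 30.0        # mm, assumed static value
oa = Block("ObstacleAvoidance")
connect(laser, "Accuracy", oa, "SensorAccuracy")
print(oa.inputs["SensorAccuracy"]())            # 30.0
```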


When the problem space is modeled in a NFDCG instance model, we obtain a solution space over all execution alternatives (active variability dependencies) by resolving the model for the current bindings of the passive variability dependencies (context) and the requirement specification. There are two solution types to determine a final solution from the solution space: (i) Satisfaction only, i.e. accept any variant that fulfills all constraints, and (ii) Satisfaction + Optimization, i.e. additionally maximize the utility depending on the conflicts and the specified preferences. Our meta-operator for a single conflict is currently a simple weighted sum approach. The overall utility for multiple conflicts is given by summing the weighted sums of all conflicts.
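A hypothetical resolution over three variants illustrates both solution types; all numbers below are invented:

```python
# Hypothetical solution space: composed NFP values and normalized
# per-conflict utilities for each execution variant.
variants = {
    "laser+Strategy_A":  {"braking_mm": 400, "utilities": [0.8, 0.2]},
    "laser+Strategy_B":  {"braking_mm": 550, "utilities": [0.5, 0.7]},
    "camera+Strategy_B": {"braking_mm": 700, "utilities": [0.3, 0.9]},
}
max_braking_mm = 600            # constraint from the requirement specification
preferences = [0.4, 0.6]        # weights of the conflict dependencies

# (i) Satisfaction only: accept every variant fulfilling all constraints.
feasible = {k: v for k, v in variants.items()
            if v["braking_mm"] <= max_braking_mm}

# (ii) Satisfaction + Optimization: maximize the weighted-sum utility.
def utility(v):
    return sum(w * u for w, u in zip(preferences, v["utilities"]))

best = max(feasible, key=lambda k: utility(feasible[k]))
print(best)  # laser+Strategy_B
```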

Functionality and Variation Points

The functionality of the presented use-case is a flexible navigation stack that realizes a GOTO(Start, Goal) capability. We assume that this system offers four variation points that do not change the functionality but impact the NFPs of the system:

  • Velocity: The maximum velocity (in mm/s) the obstacle avoidance component is allowed to command to the mobile base that finally executes the movement. It is of the type Parameter with a specific value scope defined by its minimum and maximum value and a user-defined discretization size. The maximum value depends on the physical limit of the used mobile base.
  • Navigation Strategy: A strategy that selects the motion command from a set of allowable curvatures for a specific state of the environment and robot at time t. This is done by a specific algorithm with a set of metrics evaluated by an objective function. This Variation Point is of type Fork where one out of two discrete alternatives can be selected and is part of the obstacle avoidance component.
  • Local Environment Representation/Sensor Variant: The sensor variant used to determine the local environment representation. In our system, we have two variants: 2D Laser and Camera. This Fork Variation Point represents alternative execution paths at the building block level of software components with respect to the given functional modularization.
  • Max CPU Load: This Variation Point of type Parameter allows specifying the maximum allowed CPU load (100% means no CPU load restriction).
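Taken together, the four variation points span a discrete space of execution alternatives; with assumed scopes and discretization sizes, its size is simply the product of the individual ranges:

```python
from itertools import product

# Assumed scopes/discretizations for the four variation points. (Max CPU
# Load is later bound to the CPU Utilization of the evaluated variant;
# it is enumerated freely here only for illustration.)
velocity = range(100, 1001, 100)         # mm/s, discretization size 100
strategy = ["Strategy_A", "Strategy_B"]  # Fork
sensor = ["2D Laser", "Camera"]          # Fork
max_cpu_load = [25, 50, 75, 100]         # percent

alternatives = list(product(velocity, strategy, sensor, max_cpu_load))
print(len(alternatives))  # 10 * 2 * 2 * 4 = 160
```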

The NFDCG Model

The following figure shows the graphical NFDCG model created with Sirius/Eclipse. We introduced different building blocks: (i) There are building blocks for each software component of the functional view that provide relevant non-functional information: ObstacleAvoidance, MobileBase, Laser, Camera, CameraToLaser. (ii) We defined the building block GOTO to represent the more abstract, overall behavior of the composite skill and its non-functional properties. (iii) We also defined building blocks that envelope and group together certain non-functional concerns which can be addressed in isolation by a corresponding role/expert to reduce the overall complexity, such as requirements specification, performance analysis and provision of relevant context data. These are steps of the development workflow rather than building blocks in the sense of (i) and (ii). Nevertheless, we currently treat them as building blocks as well, with the advantage of having all relevant aspects collected in one model. (iv) Finally, we use special fork building blocks to represent a Fork Variation Point.


MobileBase

This building block is associated with the functional software component of the used mobile base hardware. Relevant NFPs that this building block offers are the maximum velocity and the acceleration/deceleration property of the hardware.


Laser

This building block is associated with the functional software component of the used laser hardware. Relevant providers in this example are:

  • Accuracy: Refers to the deviation in mm. It is assumed to be a static value in this example.
  • Frequency: The frequency parameter range of the task providing the laser scan data. The maximum value is restricted by the hardware.
  • Dimension: Dimensionality of the provided representation. This is 2D for the laser hardware represented by this building block.


Camera

This building block is associated with the functional software component of the used camera hardware. Relevant providers in this example are:

  • Accuracy: Refers to the deviation in mm. It is assumed to be a static value in this example.
  • Frequency: The frequency parameter range of the task providing the image data. The maximum value is restricted by the hardware.
  • Dimension: Dimensionality of the provided representation. This is 3D for the camera hardware represented by this building block.


CameraToLaser

Maps the camera data to 2D laser data, but keeps some information a 2D laser would not be able to see. It can therefore be treated as a kind of 2.5D representation of the local environment.


ObstacleAvoidance

The functional counterpart is responsible for selecting local curvatures to avoid obstacles and to approach the next waypoint on the route to the final goal. It has the Velocity and the Navigation Strategy Variation Points. The two navigation strategy variants are modeled as inner building blocks and are associated with the conflicting properties of execution time and safety distance. We assume that Strategy_B considers an additional safety distance metric next to velocity (tries to approximate the Velocity Parameter as well as possible) and heading (tries to approximate the orientation to the next waypoint as well as possible) when evaluating curvatures. The safety distance metric at time t is defined by the minimum lateral distance to the obstacles along the whole curvature at time t. This requires additional calculations at each time step, increasing the execution time of this variant in contrast to Strategy_A, which only considers the velocity and heading criteria. Note that the problem of selecting curvatures regarding different criteria is a conflict of objectives per se, but the availability of two different implementations adds another level of conflict. The quantification of both properties is not trivial and is therefore argued as follows:

  • Execution Time: Predicting performance values for arbitrary CPUs without executing and measuring the code is a quite complex issue. We therefore assume that the provider of this component uses a precondition together with an absolute value assignment (see AC pattern) to make the description consistent. The precondition refers to the specific CPU type that was used when measuring the performance of both variants. At first sight, this precondition may reduce the reusability of the non-functional description drastically, because potential users with different CPUs cannot rely on the given value and must determine it themselves for their own context. However, identical variants of a commercial robot system are delivered with the same CPU, thereby reducing the amount of CPU variability across users when established, popular robot systems are in widespread use.
  • Safety Distance: The Safety Distance is a quite dynamic metric with a general dependency on the concrete curvature at time t. Due to these run-time dependencies on the robot and environment state, it is impossible to determine the value without evaluating the available curvatures at each time t. Hence, if we wanted to decide on the most adequate curvature depending on the absolute curvature criteria for both strategies, we would need to execute both variants and then decide based on the available non-functional curvature information and the current non-functional requirements. However, this would annul our general Execution Time-Safety Distance conflict. For that reason, and for complexity reasons, the Safety Distance is described only in a relative form (RM pattern). We directly use the internal weighting of the safety distance metric as the representative relative value of the strategy. Since Strategy_A does not consider Safety Distance at all, its value is 0.0. For Strategy_B we assume that the objective function uses equal weighting for all three criteria (Heading = 0.33, Velocity = 0.33, Safety Distance = 0.33). Hence, Safety Distance for Strategy_B is represented by 0.33.

The local conflict here is the chance for more safety distance at the expense of a higher execution time. Since the local execution time here is not very meaningful, the conflict is modeled at a higher level of abstraction (see later section about GOTO building block).
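The relative (RM) quantification above can be written down directly; the equal weights for Strategy_B are the assumption stated in the text, while Strategy_A's heading/velocity split is left unspecified here:

```python
# Internal objective-function weights per strategy; the Safety Distance
# weight doubles as the strategy's relative Safety Distance value.
objective_weights = {
    "Strategy_A": {"heading": None, "velocity": None, "safety_distance": 0.0},
    "Strategy_B": {"heading": 0.33, "velocity": 0.33, "safety_distance": 0.33},
}

def relative_safety_distance(strategy):
    return objective_weights[strategy]["safety_distance"]

print(relative_safety_distance("Strategy_A"))  # 0.0
print(relative_safety_distance("Strategy_B"))  # 0.33
```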


Sensor Fork

The mentioned discrete Sensor Variation Point is modeled by a Fork Building Block. The two alternative execution paths are (i) Laser and (ii) Camera + CameraToLaser before they merge in the ObstacleAvoidance. The special CA (Collect-Aggregate) entities of a fork building block allow us to collect alternative results for a specific property that are modeled in separate building blocks and aggregate them to a single output that provides a variability vector traceable back to the different variants. This way, we can summarize all relevant NFPs affected by this Variation Point and provide the competing values. In this example and for this Variation Point, we expect multiple values regarding the following NFPs:

  • Accuracy: Different values depending on the sensor type and the concrete hardware. Lasers are typically more accurate than cameras (when neglecting special environment context dependencies such as fog, illumination level etc.). Note that such aspects could be added as well to provide more detailed relations. But for simplicity, we assume static values here that are provided by the Laser and Camera component as mentioned earlier.
  • Worst Case Performance (Response Time and CPU Utilization): We have to compare the local performance of the variants, i.e. the performance from capturing sensor data by the respective hardware to the arrival of the data in the ObstacleAvoidance. The CA entity of this building block only collects and aggregates the different values from separate building blocks that address the performance related composition analysis (see next section).
  • Visibility: This property refers to the fact that a 2D laser is not able to see every kind of object in the environment because it only uses a flat beam at the height where the device is mounted to scan the environment. We assume that the obstacles are mapped to a 2D grid map. If we know the actual height of each obstacle cell, a 2D laser cannot see those cells whose height is lower than the height at which the sensor device is mounted. The Visibility metric therefore calculates the number of visible cells of the robot's current local environment depending on the sensor dimensionality and the height list for each obstacle cell, which is assumed to be known in advance. Since this metric is associated with the occupancy grid map, and in our system the ObstacleAvoidance is the component that provides this functionality, one could argue that the ObstacleAvoidance should be the provider of this NFP. However, in this example we decided to model this NFP and its composition function here because we assume it is the main reason why this Variation Point was added to the system by the system builder.
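The Visibility composition just described can be sketched as a small function (names and numbers are illustrative):

```python
# A 2D laser only sees obstacle cells whose height reaches its mount
# height; a 3D sensor (camera) sees all cells.
def visible_cell_count(cell_height_list_mm, sensor_dimension, laser_mount_height_mm):
    if sensor_dimension == 3:
        return len(cell_height_list_mm)
    return sum(1 for h in cell_height_list_mm if h >= laser_mount_height_mm)

heights = [120, 450, 80, 300]               # hypothetical CellHeightList
print(visible_cell_count(heights, 2, 200))  # 2D laser at 200 mm sees 2 cells
print(visible_cell_count(heights, 3, 200))  # camera sees all 4 cells
```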


Performance Analysis

These building blocks actually envelope a separate activity for a compositional performance analysis of considerable complexity. In general, it is the provider of the end-to-end performance (worst-case response time + CPU utilization) of the sensor-actuator loop for the Laser and the Camera variant, respectively. The dependencies are, among others, the execution times of the involved tasks of the components along the sensor-actuator loop, and their individual activation sources, which are also Variation Points (e.g. the frequency of a periodic timer is a Parameter that can be configured within a specific scope, see Camera and Laser). All of these dependencies can have an impact on the resulting performance values, and the composition function is represented by a SymTA/S analysis. At the moment, all these relevant aspects are explicated in separate meta-models and the concrete models must be specified manually using a dedicated DSL. Hence, this building block is currently a placeholder. In the future, we need domain-specific extensions to generate models for performance analysis from all the relevant dependencies and to execute them automatically as part of the global NFDCG resolution process. Nevertheless, we can again explicate absolute values here and make them consistent with an appropriate precondition. For example: a user of this system with the same CPU can rely on the values when using a specific configuration setting for which the values were determined and does not have to perform a separate analysis. Note that the output ports of both properties (response time/CPU utilization) will provide vectors (two values) because they depend on the execution times of the two available navigation strategies in the ObstacleAvoidance.
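As a deliberately naive stand-in for this analysis (the real composition function is a SymTA/S analysis that also accounts for activation patterns and interference, and is not reproduced here), the end-to-end worst-case response time of one sensor-actuator path can be approximated as the sum of per-task worst-case response times under a fixed-CPU precondition; all task numbers are assumed:

```python
# Simplified composition: end-to-end WCRT of a path = sum of the
# per-task WCRTs, valid only under the CPU/configuration precondition.
def end_to_end_wcrt_ms(task_wcrts_ms):
    return sum(task_wcrts_ms)

laser_path  = [15.0, 40.0]          # Laser task, ObstacleAvoidance task (assumed)
camera_path = [30.0, 25.0, 40.0]    # Camera, CameraToLaser, ObstacleAvoidance

print(end_to_end_wcrt_ms(laser_path), end_to_end_wcrt_ms(camera_path))  # 55.0 95.0
```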

CPU Allocation

This is where we model the Max CPU Load Variation Point. We define the CPU Utilization property as the relevant dependency and simply set the variation point to this value (they have the same unit). It is not a problem that CPU Utilization is actually a vector of values, because the generation and resolution process ensures that the assignment always refers to the current instance under evaluation. This has the advantage that the robot will adapt its CPU load to exactly the value that is necessary to fulfill the requirements of the currently selected execution variant. Hence, the robot automatically saves resources that are not required in certain situations. Note that this is a special Variation Point in contrast to the others, because it depends directly on the outcome of another non-functional resource and has no independent evaluation path for its scope.

Context Providers

This building block is responsible for providing relevant context data for the NFDCG, or at least for linking to the providing source. The following context data are relevant for our example:

  • BatteryLevel: The current battery level of the robot in percent. This information is provided by a service of the mobile base. This information is available in the example but not used.
  • TransportsFilledCup: A flag that indicates whether the robot is currently transporting a filled cup when this system/skill is executed. This information can be retrieved from the knowledge base of the robot.
  • LaserMountHeight: The height at which the laser is mounted on the robot.
  • CellHeightList: The height list for each static obstacle cell in the current local environment of the robot. We assume that the local environment is defined by distinct semantic locations on the overall map such as region-A, region-B etc.

Domain-specific context information and their properties need to be investigated and can then be standardized within an ecosystem. For example, there may be standardized definitions and models that describe all relevant information about the assembly of the robot, from which we can finally retrieve the mentioned laser mount height.


GOTO

This building block represents the overall functionality of driving from a start location to a goal location at the process level. It is a more abstract building block because it is a composite realization of the functional components. It follows that the non-functional aspects modeled in this non-functional building block container are equally more abstract. We consider the following non-functional properties:

  • Time: Refers to the time needed to reach the goal location. We use RM and define a linear relation to Velocity. The higher the velocity, the higher the utility with respect to time.
  • Maximum Braking Distance: The maximum distance a robot still travels until standstill after it has decided to stop - a safety concern, since it may lead to undesired collisions in dynamic environments. This property can be composed absolutely using the given formula and depends on the Velocity, the Response Time of the sensor-actuator loop and the acceleration/deceleration of the mobile base.
  • Spill Risk: The higher the velocity, the higher the spill risk (the lower the utility w.r.t Spill Risk) but of course only if the robot transports a filled cup.
  • Energy (MotorEnergy, CPUUtilization): We modeled two relevant dependencies: (i) energy consumption by the motors and (ii) energy consumption by CPU utilization. For the former we use RM(Velocity), and for the latter we have the absolute values. But due to the former relative dependency (modeling energy consumption absolutely is hard in general, but especially for this navigation architecture), the superordinate Energy property cannot be modeled absolutely. Hence, we use the relative WS pattern and adjust the weights to declare the dominating factor (MotorEnergy).
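The exact braking-distance formula of the model is not reproduced in this text; assuming the standard stopping-distance relation, it combines the distance traveled during the sensor-actuator response time with the pure braking distance at constant deceleration:

```python
# Assumed formula (standard stopping-distance relation, not necessarily
# identical to the one used in the actual model):
#   d = v * t_response + v^2 / (2 * a)
def max_braking_distance_mm(v_mm_s, response_time_s, deceleration_mm_s2):
    return v_mm_s * response_time_s + v_mm_s ** 2 / (2.0 * deceleration_mm_s2)

print(max_braking_distance_mm(500.0, 0.1, 500.0))  # 50.0 + 250.0 = 300.0 mm
```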

Finally, we model our Conflicts in this container. We use the more abstract normalized braking distance value in each conflict instead of its lower-level dependencies (Velocity and Performance) - the ones that are actually affected directly by the Variation Points. But since braking distance is composed absolutely, its normalized value is able to reflect the partial influences of the Variation Points via the lower-level dependencies. This is also the reason why we model the conflict for the Navigation Strategy Variation Point explicitly here: in the ObstacleAvoidance component, the SafetyDistance/ExecutionTime conflict is only local and the latter refers to the execution time of the individual strategies - a value that is meaningless in itself. However, the value gains importance as a partial influence factor on a more abstract property of a more abstract functionality (BrakingDistance/GOTO). In this sense, the local conflict is taken up from the ObstacleAvoidance component and then mapped to a higher level of abstraction.

Requirement Specification

We assume that non-functional requirements can be specified by the designer at design-time and can later be adapted by the end user at run-time within allowed limits. Safety constraints defined at design-time will typically have priority. The definable constraints depend on the properties that are represented as absolute values in our model. In this example, the designer specifies a maximum constraint value for the property BrakingDistance that depends on the current daytime. The designer also specifies initial preferences for the conflicts, but they can be changed dynamically at run-time according to the individual requirements of end users. An end user is either a person that directly specifies the requirements or a higher-level planning system that determines local requirements from global requirements and the current execution context. For example: assume we have a prediction model for the Time property of this skill (i.e. it is represented in an absolute form). The GOTO skill is part of a task (e.g. Get object) for which an end-to-end (global) constraint was formulated (e.g. finish the task within x minutes). The planning system can then derive a local time constraint for the GOTO skill from the global constraint and the current execution context.

nfp-management/start.txt · Last modified: 2022/12/23 11:06 by
