nfp-modeling:start [2020/02/27 10:06] (current) Timo Blender
====== A New Approach for Modeling and Composing Non-Functional Properties: Realizing Flexible and Adequate Service Robot Behavior: Further Details ======
  
This article provides additional details for the paper //A New Approach for Modeling and Composing Non-Functional Properties: Realizing Flexible and Adequate Service Robot Behavior//, submitted to IROS 2020.

Describing building blocks with respect to their NFPs can become a complex issue because there may be several dependencies that must be taken into account. Hence, static NFP descriptions are not always a meaningful choice. We propose the //NFCC// approach - a //Non-Functional Composition Chain// that captures the dependency relations of NFPs. From a general point of view, the NFCC is a dependency graph between different resources linked via special calculation blocks.
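For illustration, such a composition chain can be pictured as a small dependency graph in which known NFP values are linked via calculation blocks. The following Python sketch conveys the idea only; the class name, property names and values are hypothetical and not part of the NFCC tooling:

```python
# Minimal sketch of an NFCC as a dependency graph: NFP nodes are linked
# via calculation blocks that compute a property from its dependencies.
# All names and values here are illustrative assumptions.

class NFCC:
    def __init__(self):
        self.values = {}        # statically known NFP values
        self.calculations = {}  # nfp -> (dependencies, function)

    def set_value(self, nfp, value):
        self.values[nfp] = value

    def add_calculation(self, nfp, dependencies, function):
        self.calculations[nfp] = (dependencies, function)

    def resolve(self, nfp):
        # Recursively resolve dependencies before applying the calculation.
        if nfp in self.values:
            return self.values[nfp]
        deps, fn = self.calculations[nfp]
        return fn(*(self.resolve(d) for d in deps))

chain = NFCC()
chain.set_value("velocity_mm_s", 600.0)
chain.set_value("response_time_s", 0.2)
# The distance travelled during the reaction time depends on two other NFPs.
chain.add_calculation("reaction_distance_mm",
                      ["velocity_mm_s", "response_time_s"],
                      lambda v, t: v * t)
print(chain.resolve("reaction_distance_mm"))  # 120.0
```

Resolving a property walks the dependency chain, which is exactly what makes static NFP descriptions unnecessary for derived values.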
  
===== The Meta-Level of the Approach =====
  * A Fork variation point is a special kind of building block that manages the combinatorial presentation of variants and results.
  
===== Navigation Use-Case =====
==== Functionality and Variation Points ====
  
  
  * **Velocity:** The maximum velocity (in mm/s) the obstacle avoidance component is allowed to command to the mobile base that finally executes the movement. It is of the type Parameter with a specific value scope depending on its minimum and maximum value and a user-defined discretization size. The maximum value depends on the physical limit of the used mobile base.
  * **Navigation Strategy:** A strategy that selects the motion command from a set of allowable curvatures for a specific state of the environment and robot at time //t//. This is done by a specific algorithm with a set of metrics evaluated by an objective function. This Variation Point is of type Fork where one out of two discrete alternatives can be selected and is part of the obstacle avoidance component.
  * **Local Environment Representation/Sensor Variant:** The sensor variant used to determine the local environment representation. In our system, we have two variants: 2D Laser and Camera. This Fork Variation Point represents alternative execution paths at the building block level of software components with respect to the given functional modularization.
  * **Max CPU Load:** This Variation Point of type Parameter allows specifying the maximum allowed CPU load (100% means no CPU load restriction).
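The two kinds of Variation Points used above - a Parameter with a discretized value scope and a Fork with discrete alternatives - can be sketched as simple data structures. The class names and the concrete value ranges below are illustrative assumptions, not taken from the actual system:

```python
# Illustrative sketch of the two Variation Point kinds described above.
# Names, ranges and step sizes are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class Parameter:
    """Continuous value scope discretized into selectable variants."""
    minimum: float
    maximum: float
    step: float  # user-defined discretization size

    def variants(self):
        v, out = self.minimum, []
        while v <= self.maximum + 1e-9:  # tolerance for float accumulation
            out.append(round(v, 6))
            v += self.step
        return out

@dataclass
class Fork:
    """Discrete alternatives, e.g. algorithms or execution paths."""
    alternatives: list

# Velocity in mm/s: the maximum is bounded by the mobile base hardware.
velocity = Parameter(minimum=100.0, maximum=600.0, step=100.0)
strategy = Fork(alternatives=["Strategy_A", "Strategy_B"])
sensor = Fork(alternatives=["Laser", "Camera"])
max_cpu_load = Parameter(minimum=10.0, maximum=100.0, step=10.0)

print(velocity.variants())  # [100.0, 200.0, 300.0, 400.0, 500.0, 600.0]
```

The cross product of all variants of all Variation Points spans the configuration space the resolution process later searches.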
==== The NFCC Model ====
  
The following figure shows the graphical NFCC model created with Sirius/Eclipse. We introduced different building blocks: (i) There are building blocks for each software component of the functional view that provide relevant non-functional information: //ObstacleAvoidance//, //MobileBase//, //Laser//, //Camera//, //CameraToLaser//. (ii) We defined the building block //GOTO// to represent the more abstract, overall behavior of the composite skill and its non-functional properties. (iii) We also defined building blocks to envelope and group together certain non-functional concerns that can be addressed in isolation by a corresponding role/expert to reduce the overall complexity, such as requirements specification, performance analysis and provision of relevant context data. These are rather steps of the development workflow than building blocks in the sense of (i) and (ii). Nevertheless, we currently treat them as building blocks as well, with the advantage of having all relevant aspects collected in one model. (iv) Finally, we use special fork building blocks to represent a Fork Variation Point.
  
{{ :nfp-modeling:model.png |}}
  
=== MobileBase ===
  
This building block is associated with the functional software component of the used mobile base hardware. Relevant NFPs that this building block offers are the maximum velocity and the acceleration/deceleration property of the hardware.
  
=== Laser ===
  
This building block is associated with the functional software component of the used laser hardware. Relevant providers in this example are:
  
  * **Accuracy:** Refers to the deviation in mm. It is assumed to be a static value in this example.
  * **Frequency:** The frequency parameter range of the task providing the laser scan data. The maximum value is restricted by the hardware.
  * **Dimension:** Dimensionality of the provided representation. This is 2D for the laser hardware represented by this building block.
  
=== Camera ===
  
This building block is associated with the functional software component of the used camera hardware. Relevant providers in this example are:
  
  * **Accuracy:** Refers to the deviation in mm. It is assumed to be a static value in this example.
  * **Frequency:** The frequency parameter range of the task providing the image data. The maximum value is restricted by the hardware.
  * **Dimension:** Dimensionality of the provided representation. This is 3D for the camera hardware represented by this building block.
  
=== CameraToLaser ===
  
Maps the camera data to 2D laser data, but keeps some information a 2D laser would not be able to see. It can therefore be treated as a kind of 2.5D representation of the local environment.
  
=== ObstacleAvoidance ===
The functional counterpart is responsible for selecting local curvatures to avoid obstacles and to approach the next waypoint on the route to the final goal. It has the Velocity and the Navigation Strategy Variation Points. The two navigation strategy variants are modeled as inner building blocks and are associated with the conflicting properties of execution time and safety distance. We assume that Strategy_B considers an additional safety distance metric next to velocity (tries to approximate the Velocity Parameter as well as possible) and heading (tries to approximate the orientation to the next waypoint as well as possible) when evaluating curvatures.
The safety distance metric at time //t// is defined by the minimum lateral distance to the obstacles along the whole curvature at time //t//. This requires additional calculations each time, increasing the execution time of this variant in contrast to Strategy_A, which only considers the velocity and heading criteria. Note that the problem of selecting curvatures regarding different criteria is a conflict of objectives per se, but the availability of two different implementations adds a further level of conflict. The quantification of both properties is not trivial and is thus argued as follows:
  
  * **Execution Time:** Predicting performance values for arbitrary CPUs without executing and measuring the code is a quite complex issue. We therefore assume that the provider of this component uses a precondition together with an absolute value assignment (see AC pattern) to make the description consistent. The precondition refers to the specific CPU type that was used when measuring the performance of both variants. At first sight, this precondition may reduce the reusability of the non-functional description drastically because potential users with different CPUs cannot rely on the given value and must determine it themselves for their own context. However, identical variants of a commercial robot system are delivered with the same CPU, thereby reducing the amount of CPU variability for different users when established, popular robot systems are in widespread use.
  * **Safety Distance:** The Safety Distance is a quite dynamic metric with a general dependency on the concrete curvature at time //t//. Due to these run-time dependencies on the robot and environment state, it is impossible to determine the value without evaluating the available curvatures at each time //t//. Hence, if we wanted to decide on the most adequate curvature depending on the absolute curvature criteria for both strategies, we would need to execute both variants and then decide based on the available non-functional curvature information and the current non-functional requirements. However, this would annul our general Execution Time-Safety Distance conflict.
For that reason, and for complexity reasons, the Safety Distance is described only in a relative form (RM pattern). We directly use the internal weighting for the safety distance metric as the representative relative value of the strategy. Since Strategy A does not consider Safety Distance at all, the value is 0.0. For Strategy B we assume that the objective function uses equal weighting for all 3 criteria (Heading = 0.33, Velocity = 0.33, Safety Distance = 0.33). Hence safety distance for Strategy B is represented by 0.33.
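To illustrate the weighting argument, the sketch below evaluates candidate curvatures with a weighted objective function. Only Strategy_B's equal weighting (0.33 each) and the fact that Strategy_A ignores safety distance are fixed by the text; the metric values and the equal heading/velocity split assumed for Strategy_A are illustrative:

```python
# Sketch of the objective functions behind the two strategies.
# Strategy_A's weight split and all metric values are assumptions;
# the text only fixes Strategy_B's equal 0.33 weighting.

def evaluate(curvature_metrics, w_heading, w_velocity, w_safety):
    """Weighted sum over normalized per-curvature metrics in [0, 1]."""
    h, v, s = curvature_metrics
    return w_heading * h + w_velocity * v + w_safety * s

def select(curvatures, weights):
    # Pick the curvature with the best weighted score at time t.
    return max(curvatures, key=lambda c: evaluate(c, *weights))

strategy_a = (0.5, 0.5, 0.0)     # heading + velocity only (assumed split)
strategy_b = (0.33, 0.33, 0.33)  # equal weighting, as stated in the text

# (heading, velocity, safety_distance) scores for three candidate curvatures
candidates = [(0.9, 0.8, 0.1), (0.7, 0.7, 0.9), (0.4, 0.9, 0.5)]
print(select(candidates, strategy_a))  # (0.9, 0.8, 0.1)
print(select(candidates, strategy_b))  # (0.7, 0.7, 0.9) - safety shifts the choice
```

The same candidate set yields different winners under the two strategies, which is exactly the conflict the relative RM value is meant to capture.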
  
The local conflict here is the chance for more safety distance at the expense of a higher execution time. Since the local execution time here is not very meaningful, the conflict is modeled at a higher level of abstraction (see the later section about the GOTO building block).
=== SensorVariant ===
  
The mentioned discrete Sensor variation point is modeled by a Fork Building Block. The two alternative execution paths are (i) Laser and (ii) Camera followed by CameraToLaser before they merge in the ObstacleAvoidance component. The special CA (Collect-Aggregate) entities of a fork building block allow us to collect alternative results for a specific property that are modeled in separate building blocks and aggregate them to a single output that provides a variability vector that can be traced back to the different variants. This way, we can summarize all relevant NFPs affected by this Variation Point and provide conflicting values. In this example and for that Variation Point, we expect multiple values regarding the following NFPs:
  * **Accuracy:** Different values depending on the sensor type and the concrete hardware. Lasers are typically more accurate than cameras (when neglecting special environment context dependencies such as fog, illumination level etc.). Note that such aspects could be added as well to provide more detailed relations. But for simplicity, we assume static values here that are provided by the Laser and Camera component as mentioned earlier.
  * **Worst Case Performance (Response Time and CPU Utilization):** We have to compare the local performance of the variants, i.e. the performance from capturing sensor data by the respective hardware to the arrival of the data in the ObstacleAvoidance component. The CA entity of this building block only collects and aggregates the different values from separate building blocks that address the performance-related composition analysis (see next section).
  * **Visibility:** This property refers to the fact that a 2D laser is not able to see every kind of object in the environment because it uses only a flat beam at a certain height where the device is mounted to scan the environment. We assume that the obstacles are mapped to a 2D grid map. If we know the actual height of each obstacle cell, a 2D laser cannot see those cells whose height is lower than the height where the sensor device is mounted. The Visibility metric therefore calculates the number of visible cells of the robot's current local environment depending on the sensor dimensionality and the height list for each obstacle cell, which is assumed to be known in advance.
Since this metric is associated with the occupancy grid map, and in our system the ObstacleAvoidance is the component that provides this functionality, one could argue that the ObstacleAvoidance should be the provider of this NFP. However, in this example we decided to model this NFP and its composition function here because we assume it is the main reason why this variation point was added to the system by the system builder.
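A minimal sketch of such a Visibility metric under the stated assumption that the cell heights are known in advance. The function name and units are hypothetical, and the 3D camera variant is assumed to see every obstacle cell:

```python
# Sketch of the Visibility metric: number of obstacle cells a sensor
# variant can see. A 2D laser misses cells lower than its mount height;
# the 3D camera variant is assumed to see all cells. Illustrative only.

def visible_cells(cell_heights_mm, sensor_dim, laser_mount_height_mm=None):
    if sensor_dim == 3:
        # Camera: assumed to see every obstacle cell.
        return len(cell_heights_mm)
    # 2D laser: only cells reaching up to the scan plane are visible.
    return sum(1 for h in cell_heights_mm if h >= laser_mount_height_mm)

# Height list for the obstacle cells of the current local environment (mm)
heights = [50, 120, 300, 80, 450]
print(visible_cells(heights, 2, laser_mount_height_mm=200))  # 2 (300 and 450)
print(visible_cells(heights, 3))                             # 5
```

The metric thus depends on exactly the context data listed later (LaserMountHeight, CellHeightList) plus the Dimension NFP of the chosen sensor variant.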
  
=== Performance_Laser/Performance_Camera ===
  
These building blocks actually envelope a separate activity for a compositional performance analysis that is highly complex. In general, it is the provider of the end-to-end performance (worst case response time + CPU utilization) of the sensor-actuator loop for the Laser and the Camera variant respectively. The dependencies are, among others, the execution times of the involved tasks of the components along the sensor-actuator loop, and their individual activation sources that are also Variation Points (e.g. the frequency of a periodic timer is a Parameter that can be configured within a specific scope, see Camera and Laser). All these dependencies can have an impact on the resulting performance values, and the composition function is represented by a SymTA/S analysis. For more information see https://robmosys.eu/wiki/baseline:environment_tools:smartsoft:smartmdsd-toolchain:cause-effect-chain:start?s[]=symta. At the moment, all those relevant aspects are explicated in separate meta-models and the concrete models must be specified manually using an own DSL. Hence, this building block represents a placeholder at the moment. We need domain-specific extensions in the future to generate models for performance analysis from all the relevant dependencies and to execute them automatically as part of the global NFCC resolution process. Nevertheless, we can again explicate absolute values here and make them consistent with an appropriate precondition. For example: a user of this system with the same CPU can rely on the values when using the specific configuration setting for which the values were determined and need not perform a separate analysis.
Note that the output ports of both properties (response time/CPU utilization) will provide vectors (two values) because they depend on the execution times of the two available navigation strategies in the ObstacleAvoidance component.
  
=== CPU Allocation ===
=== Context Providers ===
  
This building block is responsible for providing relevant context data for the NFCC, or at least for linking to the providing source. The following context data are relevant for our example:
    
  * **BatteryLevel:** The current battery level of the robot in percent. This information is provided by a service of the mobile base. It is available in the example but not used.
  * **TransportsFilledCup:** A flag that indicates whether the robot is currently transporting a filled cup when this system/skill is executed. This information can be retrieved from the knowledgebase of the robot.
  * **LaserMountHeight:** The height at which the laser is mounted on the robot.
  * **CellHeightList:** The height list for each static obstacle cell in the current local environment of the robot. We assume that the local environment is defined by distinct semantic locations on the overall map such as region-A, region-B etc.
  
Domain-specific context information and its properties need to be investigated and can then be standardized within an ecosystem. For example, there may be standardized definitions and models that describe all relevant information about the assembly of the robot. From these, we can finally retrieve the mentioned laser mount height.
  
=== GOTO ===
This building block represents the overall functionality of driving from a start location to a goal location at the process level. It is a more abstract building block because it is a composite realization of the functional components. It follows that the non-functional aspects modeled in this non-functional building block container are equally more abstract. We consider the following non-functional properties:
  
  * **Time:** Refers to the time needed to reach the goal location. We use RM and define a linear relation to Velocity: the higher the velocity, the higher the utility with respect to time.
  * **Maximum Braking Distance:** The maximum distance a robot still travels until standstill after it has decided to stop - a safety concern that may lead to undesired collisions in dynamic environments. This property can be composed in absolute terms using the given formula and depends on the Velocity, the Response Time of the sensor-actuator loop and the acceleration/deceleration of the mobile base.
  * **Spill Risk:** The higher the velocity, the higher the spill risk (the lower the utility w.r.t. Spill Risk) - but of course only if the robot transports a filled cup.
  * **Energy (MotorEnergy, CPUUtilization):** We modeled two relevant dependencies: (i) energy consumption by the motors and (ii) energy consumption by CPU utilization. For the former we use RM(Velocity) and for the latter we have the absolute values. But due to the former relative dependency (modeling energy consumption in absolute terms is hard in general, and especially for this navigation architecture), the superordinate Energy property cannot be modeled in absolute terms. Hence, we use the relative WS pattern and adjust the weights to declare the dominating factor (MotorEnergy).
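The two composition styles above can be illustrated with a small sketch. This is not code from the paper: the kinematic braking-distance formula (reaction distance plus stopping distance under constant deceleration), the function names and the weight values are all assumptions chosen for demonstration.

```python
def braking_distance(velocity, response_time, deceleration):
    """Absolute composition: distance travelled until standstill,
    i.e. reaction distance (v * t_response) plus braking distance
    under constant deceleration (v^2 / (2 * a)).
    Assumed standard kinematics, not the paper's exact formula."""
    return velocity * response_time + velocity ** 2 / (2.0 * deceleration)


def energy_utility(motor_energy_util, cpu_util_util, w_motor=0.8, w_cpu=0.2):
    """Relative WS pattern: weighted sum of normalized sub-utilities.
    The weights (hypothetical values) declare the motors as the
    dominating factor."""
    return w_motor * motor_energy_util + w_cpu * cpu_util_util


# Example: 0.5 m/s velocity, 200 ms sensor-actuator response time,
# 1.0 m/s^2 deceleration -> 0.1 m reaction + 0.125 m braking = 0.225 m
d = braking_distance(0.5, 0.2, 1.0)
```

Note how the absolute composition yields a physical quantity (meters) that can be checked against a requirement directly, while the WS pattern only yields a relative utility in [0, 1].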
  
Finally, we model our Conflicts in this container. We use the more abstract normalized braking distance value in each conflict instead of its lower-level dependencies (Velocity and Performance), which are the ones actually affected directly by the Variation Points. But since braking distance is composed in absolute terms, its normalized value is able to reflect the partial influences of the Variation Points via the lower-level dependencies. This is also the reason why we model the conflict for the Navigation Strategy Variation Point explicitly here: in the ObstacleAvoidance component, the SafetyDistance/ExecutionTime conflict is only local, and the latter refers to the execution time of the individual strategies - a value that is meaningless in isolation. However, the value gains importance as a partial influence factor on a more abstract property of a more abstract functionality (BrakingDistance/GOTO).
In this sense, the local conflict is taken up from the ObstacleAvoidance component and then mapped to a higher level of abstraction.
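Normalization is what makes the absolute braking distance usable inside a conflict. A minimal sketch, assuming a simple clamped min-max mapping (the paper does not prescribe this particular normalization; the bounds are hypothetical):

```python
def normalized_braking_distance(d, d_min, d_max):
    """Map an absolute braking distance (meters) to a utility in [0, 1].
    Shorter braking distance -> higher utility. Values outside the
    assumed bounds [d_min, d_max] are clamped before normalizing."""
    d = max(d_min, min(d, d_max))
    return (d_max - d) / (d_max - d_min)


# Example: with assumed bounds of 0.0 m (best) and 1.0 m (worst),
# a braking distance of 0.25 m yields a utility of 0.75.
u = normalized_braking_distance(0.25, 0.0, 1.0)
```

Because the normalized value is derived from the absolutely composed distance, any Variation Point that changes Velocity or Performance shifts this single utility, which is why the conflicts can be stated against it rather than against each lower-level dependency.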
  
=== Requirement Specification ===
nfp-modeling/start.1582739828.txt.gz · Last modified: 2020/02/26 18:57 by Timo Blender