Aim :- Project on Vehicle Direction Detection
Objective :-
- The objective of this project is to develop an MBD-compliant MATLAB Simulink model for vehicle direction detection as per the given requirements.
- Tag the requirements to the Simulink model: Requirement 1 and Requirement 2 are tagged to their corresponding subsystems.
- Make the necessary MBD-compliant changes.
- Create a Simulink Data Dictionary (SLDD) for the given model.
- Choose a sample time of 0.01 sec for all signals.
- Create a MAB guideline report for the given vehicle direction detection model.
- Create a test harness to perform a Model-in-Loop (MIL) test.
Theory on ADAS (Advanced Driver Assistance Systems) :-
The great majority of vehicle accidents are caused by human error, which can be avoided with Advanced Driver Assistance Systems (ADAS). The role of ADAS is to prevent deaths and injuries by reducing the number of car accidents and the serious impact of those that cannot be avoided.
Essential safety-critical ADAS applications include:
- Pedestrian detection/avoidance
- Lane departure warning/correction
- Traffic sign recognition
- Automatic emergency braking
- Blind spot detection
These lifesaving systems are key to ensuring the success of ADAS applications, incorporating the latest interface standards and running multiple vision-based algorithms to support real-time multimedia, vision co-processing, and sensor fusion subsystems.
Working Principle of ADAS :-
- Automobiles are the foundation of the next generation of mobile-connected devices, with rapid advances being made in autonomous vehicles. Autonomous application solutions are partitioned into various chips, called systems-on-chip (SoCs). These chips connect sensors to actuators through interfaces and high-performance ECUs (electronic control units).
- Self-driving cars use a variety of these applications and technologies to gain 360-degree vision, both near (in the vehicle’s immediate vicinity) and far. That means hardware designs are using more advanced process nodes to meet ever-higher performance targets while simultaneously reducing demands on power and footprint.
Some of the most common ADAS applications are as under:-
1. Adaptive Cruise Control
Adaptive cruise control (ACC) is particularly helpful on the highway, where drivers can find it difficult to monitor their speed and other cars over a long period of time. Advanced cruise control can automatically accelerate, slow down, and at times stop the vehicle, depending on the actions of other objects in the immediate area.
2. Glare-Free High Beam and Pixel Light
Glare-free high beam and pixel light uses sensors to adjust to darkness and the vehicle’s surroundings without disturbing oncoming traffic. This new headlight application detects the lights of other vehicles and redirects the vehicle’s lights away to prevent other road users from being temporarily blinded.
3. Adaptive Light Control
Adaptive light control adapts the vehicle’s headlights to external lighting conditions. It changes the strength, direction, and rotation of the headlights depending on the vehicle’s environment and darkness.
4. Automatic Parking
Automatic parking helps inform drivers of blind spots so they know when to turn the steering wheel and stop. Vehicles equipped with rearview cameras have a better view of their surroundings than traditional side mirrors. Some systems can even complete parking automatically without the driver’s help by combining the input of multiple sensors.
5. Autonomous Valet Parking
Autonomous valet parking is a new technology that works via vehicle sensor meshing and 5G network communication, with cloud services that manage autonomous vehicles in parking areas. The vehicle's sensors provide it with information about where it is, where it needs to go, and how to get there safely. All this information is methodically evaluated and used to perform acceleration, braking, and steering until the vehicle is safely parked.
6. Navigation System
Car navigation systems provide on-screen instructions and voice prompts to help drivers follow a route while concentrating on the road. Some navigation systems can display exact traffic data and, if necessary, plan a new route to avoid traffic jams. Advanced systems may even offer head-up displays (HUD) to reduce driver distraction.
Vehicle direction detection can be performed alongside the GPS system in order to check that the vehicle is following its route in an optimized manner.
fig. 1. Level of Automation
Importance of Vehicle Direction Determination :-
- Identifying the direction of the vehicle is one of the important and diverse features in autonomous driving and Advanced Driver Assistance features. This sub-feature identifies the direction the vehicle is taking based on the camera input.
- The camera reads the road signs and stores them in its memory with unique values for left turn, right turn and straight drive. Depending on the direction the vehicle is taking, a final indication is given to the driver, showing whether or not he is driving in the recommended direction.
- Vehicle Direction Determination can also be coupled alongside features like GPS systems to identify whether the vehicle is reaching its destination in an optimized manner. This sub-feature can also be used along with Lane Detection, Highway Warning, Ramp Entry/Exit Wrong-Way Detection, etc.
Development of the Vehicle Direction Determination model is as under;
(1) Main Model :-

fig.2 Vehicle_Direction
- The above model is the main model of the Vehicle Direction Determination logic. Here, as per the given condition, the input signals to this system are SteeringWheel_YawDegreeInput and CameraInput_RoadSign. Since both signals originate (are used for the first time) here, we have to resolve them by right-clicking each signal, opening Signal Properties, and checking the first box as under;

- As per the given condition/logic, these signals are used to confirm the occurrence of a road sign.
- The camera used in this case reads the road signs and stores in its memory unique values for Right Turn, Left Turn and Straight Drive. Depending on the direction the vehicle takes, a final indication is given to the driver, showing whether or not he is driving in the recommended direction.
- The output of the system (signal name: Vehicle_Direction_Indicator) is a vector of boolean values (0 or 1) which specifies whether the vehicle is turning right, turning left or going straight.
As 'Vehicle_Direction' is the main subsystem, it is further divided into two more subsystems, because it needs to follow the two given requirements, i.e. Requirement1 and Requirement2.
The inside of the main subsystem 'Vehicle_Direction' is shown as under:

Figure 3:- 'Requirement1' and 'Requirement2' subsystems
- Here, the first input signal, SteeringWheel_YawDegreeInput, is applied to the Requirement1 subsystem.
- In the Requirement1 subsystem, the steering wheel yaw degree (signal name: SteeringWheel_YawDegreeInput) is given as input and compared against three angular values, one each for right turn, left turn and straight drive (calibration values: Right_Turn_AngularLimit, Left_Turn_AngularLimit, Straight_Drive_Steering_Angle), to determine in which direction the steering wheel is turning.
- Accordingly, the output signal Vehicle_Turn_Status goes out from the Requirement1 subsystem to report the vehicle turning status.
- This local signal from the Requirement1 block, Vehicle_Turn_Status, and the input signal CameraInput_RoadSign are fed into the Requirement2 subsystem. There, Vehicle_Turn_Status is compared (with the == relational operator) against the road-sign calibration values (RightTurn_RoadSign, LeftTurn_RoadSign and Straight_RoadSign); a match gives logical 1 (true). Each comparison result is then ANDed with the input signal CameraInput_RoadSign, so only when the camera has actually captured a road sign and the turn status matches the corresponding road-sign value is the output signal Vehicle_Direction_Indicator asserted, in the form of boolean values.
- The output 'Vehicle_Direction_Indicator' is a three-element boolean signal, with the first element indicating 'Right Turn', the second element indicating 'Left Turn' and the third element indicating 'Straight Drive'.
Requirement1 subsystem :-
- In the Requirement1 subsystem, the steering wheel yaw degree (signal name: SteeringWheel_YawDegreeInput) is given as input and compared against three angular values, one each for right turn, left turn and straight drive (calibration values: Right_Turn_AngularLimit, Left_Turn_AngularLimit, Straight_Drive_Steering_Angle), to determine in which direction the steering wheel is turning.

Figure 4 :- 'Requirement1' subsystem
- In this case, the output of the relational operator '==' is equal to 1 (logical true) only when 'SteeringWheel_YawDegreeInput' matches one of the calibration values (i.e. Right_Turn_AngularLimit, Left_Turn_AngularLimit, or Straight_Drive_Steering_Angle).
- Three Switch blocks are used to implement if-else logic. The output of each relational operator (==) is fed to the second (control) port of each Switch block. The switch passes input 1 (the calibration value) when the control condition (input 2 > 0) is satisfied; otherwise input 3 (the output of the next Switch block) is passed to the output.
- If 'SteeringWheel_YawDegreeInput' does not match any of the calibration values, every comparison gives logical 0 (false). The last switch then takes the value wired to its third port, which in this case is the constant 0, and the same is given out as the output, Vehicle_Turn_Status.
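The switch cascade described above can be sketched in ordinary code. The following Python snippet is an illustrative sketch only (the actual logic lives in the Simulink blocks); the calibration values are taken from the data dictionary table in this report:

```python
# Illustrative sketch of the 'Requirement1' switch cascade; the real logic
# is implemented with relational operators and Switch blocks in Simulink.
RIGHT_TURN_ANGULAR_LIMIT = 30       # deg (calibration value)
LEFT_TURN_ANGULAR_LIMIT = -120      # deg (calibration value)
STRAIGHT_DRIVE_STEERING_ANGLE = 0   # deg (calibration value)

def vehicle_turn_status(steering_wheel_yaw_degree: int) -> int:
    """Each '==' comparison drives a Switch control port: a switch passes
    its calibration value on a match, otherwise it falls through to the
    next switch; the last switch falls through to the constant 0."""
    if steering_wheel_yaw_degree == RIGHT_TURN_ANGULAR_LIMIT:
        return RIGHT_TURN_ANGULAR_LIMIT
    elif steering_wheel_yaw_degree == LEFT_TURN_ANGULAR_LIMIT:
        return LEFT_TURN_ANGULAR_LIMIT
    elif steering_wheel_yaw_degree == STRAIGHT_DRIVE_STEERING_ANGLE:
        return STRAIGHT_DRIVE_STEERING_ANGLE
    else:
        return 0  # value wired to the third port of the last switch
```

Note that, because Straight_Drive_Steering_Angle is 0, a straight-drive match and a no-match case both yield Vehicle_Turn_Status = 0; with these calibration values, it is the camera AND condition in Requirement2 that finally gates the output.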
Requirement2 subsystem :-
- In the 'Requirement2' subsystem, the local signal 'Vehicle_Turn_Status' fed from the 'Requirement1' subsystem is given as an input and compared against the calibration values (RightTurn_RoadSign, LeftTurn_RoadSign, Straight_RoadSign) using the relational operator '=='.
- If the value matches, the relational operator (==) gives logical 1 (true). Each comparison result is ANDed with the camera input signal CameraInput_RoadSign, so only when the camera has actually captured a road sign and the turn status matches the corresponding road-sign calibration value is the output signal Vehicle_Direction_Indicator asserted, in the form of boolean values.
- The output 'Vehicle_Direction_Indicator' is a three-element boolean signal, with the first element indicating 'Right Turn', the second element indicating 'Left Turn' and the third element indicating 'Straight Drive'.
- Since the signal Vehicle_Turn_Status is already defined in this model, we propagate the same signal by clicking on it and entering '<>'.
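The comparison-and-AND logic of 'Requirement2' can likewise be sketched in Python (again an illustrative sketch, not the Simulink model itself; the road-sign calibration values are taken from the data dictionary table in this report):

```python
# Illustrative sketch of the 'Requirement2' logic: three '==' comparisons,
# each ANDed with the camera input, produce the 3-element boolean output.
RIGHT_TURN_ROAD_SIGN = 30     # calibration value
LEFT_TURN_ROAD_SIGN = -120    # calibration value
STRAIGHT_ROAD_SIGN = 0        # calibration value

def vehicle_direction_indicator(vehicle_turn_status: int,
                                camera_input_road_sign: bool) -> list:
    """Returns [Right_Turn, Left_Turn, Straight_Drive]: an element is true
    only when Vehicle_Turn_Status equals the matching road-sign value AND
    the camera has actually captured a road sign."""
    return [
        vehicle_turn_status == RIGHT_TURN_ROAD_SIGN and camera_input_road_sign,
        vehicle_turn_status == LEFT_TURN_ROAD_SIGN and camera_input_road_sign,
        vehicle_turn_status == STRAIGHT_ROAD_SIGN and camera_input_road_sign,
    ]
```

For example, vehicle_direction_indicator(30, True) yields [True, False, False], i.e. a confirmed Right Turn, while any input with camera_input_road_sign = False yields [False, False, False].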

Figure 5: 'Requirement2' subsystem
However, the output signal (Vehicle_Direction_Indicator) is formed newly in this subsystem, hence we resolve it by right-clicking and opening Signal Properties,

Then, we changed the configuration parameters: after opening 'Model Settings', we set the solver type to Fixed-step, the solver to discrete (no continuous states), and the fixed-step size (sample time) to 0.01 sec.

Then, we changed the System Target File to 'ert.tlc' under the Code Generation section, as under;

Creation of Simulink Data Dictionary :-
Step 1 :-
We have to go to the Modeling tab and select 'Link to Data Dictionary', as under;

Step 2 :-
Create a new data dictionary in the given folder and name it with the .sldd extension.

Step 3 :-
We have entered the input signals, the output signal and the calibration values/constants, and the same are as under;
| Signal / Calibration Name     | Signal Type | Data Type | Dimension | Min  | Max | Initial Value | Units |
|-------------------------------|-------------|-----------|-----------|------|-----|---------------|-------|
| SteeringWheel_YawDegreeInput  | Input       | Int16     | 1         | -180 | 180 | -             | Deg   |
| CameraInput_RoadSign          | Input       | Boolean   | 1         | 0    | 1   | -             | -     |
| Vehicle_Turn_Status           | Local       | Int16     | 1         | -180 | 180 | -             | Deg   |
| Right_Turn_AngularLimit       | Calibration | Int16     | [1 1]     | -180 | 180 | 30            | Deg   |
| Left_Turn_AngularLimit        | Calibration | Int16     | [1 1]     | -180 | 180 | -120          | Deg   |
| Straight_Drive_Steering_Angle | Calibration | Int16     | [1 1]     | -180 | 180 | 0             | Deg   |
| RightTurn_RoadSign            | Calibration | Int16     | [1 1]     | -180 | 180 | 30            | -     |
| LeftTurn_RoadSign             | Calibration | Int16     | [1 1]     | -180 | 180 | -120          | -     |
| Straight_RoadSign             | Calibration | Int16     | [1 1]     | -180 | 180 | 0             | -     |
| Vehicle_Direction_Indicator   | Output      | Boolean   | 3         | 0    | 1   | -             | -     |
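The dictionary entries above can also be mirrored as plain Python data for scripted sanity checks; this is an illustrative sketch only, since the actual dictionary is authored in the .sldd file:

```python
# Hedged sketch: mirrors the SLDD table above as plain Python data.
# The real data dictionary lives in vehicle_detection_dd.sldd.
signals = {
    "SteeringWheel_YawDegreeInput": {"kind": "Input",  "dtype": "int16",   "min": -180, "max": 180},
    "CameraInput_RoadSign":         {"kind": "Input",  "dtype": "boolean", "min": 0,    "max": 1},
    "Vehicle_Turn_Status":          {"kind": "Local",  "dtype": "int16",   "min": -180, "max": 180},
    "Vehicle_Direction_Indicator":  {"kind": "Output", "dtype": "boolean", "min": 0,    "max": 1},
}
calibrations = {
    "Right_Turn_AngularLimit": 30,
    "Left_Turn_AngularLimit": -120,
    "Straight_Drive_Steering_Angle": 0,
    "RightTurn_RoadSign": 30,
    "LeftTurn_RoadSign": -120,
    "Straight_RoadSign": 0,
}

# Sanity check: every calibration value lies inside the yaw range [-180, 180]
# that the int16 signals are declared with.
for name, value in calibrations.items():
    assert -180 <= value <= 180, name
```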
Accordingly, we created the Simulink Data Dictionary (SLDD) by clicking on the file name, i.e. vehicle_detection_dd.sldd, and selecting Design Data. Then, we select 'Add Signal' or 'Add Simulink Parameter' depending on whether the entry is an input signal, an output signal or a constant/calibration value.

- Here, we chose the storage class for the input signals as ImportedExtern, for the output signal as ExportToFile, for the local signals as Localizable, and for the calibration parameters as Const.
- Sample time for all signals are taken as 0.01 sec as per given condition.
- While declaring the output signal and the constant values, we gave the header file name as 'detection_head.h' and the definition file name as 'detection_def.c'. The same is as under;

Requirement Tagging of Simulink Model:-
- The requirements ‘Requirement-1’ and ‘Requirement-2’ as provided in the Requirements Word document are tagged to their corresponding subsystem in the Simulink Model.
Steps for tagging a requirement in Simulink model :-
- First, we have to highlight/select the requirement statement in the Word document. This Word document needs to be on the same MATLAB path.
- Then, in the Simulink model, we right-click on the particular subsystem we need to tag, select 'Requirements', and then click 'Link to Selection in Word'. The particular subsystem then gets tagged to the highlighted text in the Word document.
- After successful tagging, a MATLAB traceability file is generated on the same MATLAB path.
We have considered above steps while tagging the Requirement1 and Requirement2 and the same are as under;
Tagging for Requirement1 :-


Tagging for Requirement2 :-



Steps for Checking MAB Guidelines :-
Step 1:-
We have to go to the 'Modeling' tab and then select 'Model Advisor'.

Step 2:-
We have to choose the system for checking the guidelines. In this case, we are running a full-system check against the MAB guidelines, as under;

Step 3 :-
Thereafter, the below screen appears. Here, we have to make the selection as per the MAB modelling standards.
Then, we have to click on the 'Run Selected Checks' option, as under;

After running the Model Advisor as per the MAB guidelines, we observed that the model passed 125 checks with a total of 19 warnings and no failed checks. Hence, we can conclude that the above model is technically/logically correct.

After successful running of model as per MAB guidelines, we observed that guideline report is generated in the same Matlab folder path. (The same is attached)
Now, we will create the Model-in-Loop (MIL) test harness :-
- A harness for the Model-in-Loop test is created by right-clicking on the main 'Vehicle_Direction' subsystem, then selecting 'Test Harness' and 'Create for Vehicle_Direction'; accordingly, a name is given and the harness is saved in the same MATLAB folder.
- While creating the test harness, the source is taken as a Signal Builder block and the sink/output is fed directly to the Workspace.

Then, the basic Model in Loop test harness is created as under;

Next, we created test data as input in the form of an Excel sheet and gave the same as the harness input, as under;
Input test data :-

We imported the test data signals, i.e. 'SteeringWheel_YawDegreeInput' and 'CameraInput_RoadSign', into the Signal Builder block as under;


Accordingly, we successfully imported the test data signals as under;

fig. Signal Builder block test signals for 'SteeringWheel_YawDegreeInput' and ‘CameraInput_RoadSign’
Thereafter, the final Model-in-Loop (MIL) test harness is as under;

Output Results :-


- The test is performed for the developed model using Model-in-Loop (MIL). The test is carried out for a duration of 5 seconds with the sample time for all signals set as 0.01 sec.
- The simulation is run for 5 seconds using the MIL test harness, and the results for the output signal ‘Vehicle_Direction_Indicator’ are copied from the workspace into the Excel sheet ‘Test_Report_Data’ to check whether the results match the expected output.
- The input signal ‘CameraInput_RoadSign’ is set as ‘1’ between 0 and 3 seconds to confirm the occurrence of a road sign, set as ‘0’ between 3 and 4 seconds (no road sign captured), and set back to ‘1’ between 4 and 5 seconds.
The input signal ‘SteeringWheel_YawDegreeInput’ value is set as:
- ‘30’ between 0 and 1 seconds, which matches the value of the calibration parameter ‘Right_Turn_AngularLimit’ in the ‘Requirement1’ subsystem and the parameter ‘RightTurn_RoadSign’ in the ‘Requirement2’ subsystem. The output signal ‘Vehicle_Direction_Indicator’, whose dimension is 3, will have the boolean value [1 0 0] to indicate Right Turn between 0 and 1 second.
- ‘-120’ between 1 and 2 seconds, which matches the value of the calibration parameter ‘Left_Turn_AngularLimit’ in the ‘Requirement1’ subsystem and the parameter ‘LeftTurn_RoadSign’ in the ‘Requirement2’ subsystem. The output signal ‘Vehicle_Direction_Indicator’ will have the boolean value [0 1 0] to indicate Left Turn between 1 and 2 seconds.
- ‘0’ between 2 and 3 seconds, which matches the value of the calibration parameter ‘Straight_Drive_Steering_Angle’ in the ‘Requirement1’ subsystem and the parameter ‘Straight_RoadSign’ in the ‘Requirement2’ subsystem. The output signal ‘Vehicle_Direction_Indicator’ will have the boolean value [0 0 1] to indicate Straight Drive between 2 and 3 seconds.
- ‘30’ between 3 and 4 seconds, which matches the value of the calibration parameter ‘Right_Turn_AngularLimit’ in the ‘Requirement1’ subsystem. However, since ‘CameraInput_RoadSign’ is ‘0’ in this interval, the AND condition in the ‘Requirement2’ subsystem is not satisfied. The output signal ‘Vehicle_Direction_Indicator’ will have the boolean value [0 0 0], indicating that no turning operation is confirmed.
- ‘-120’ between 4 and 5 seconds, which matches the value of the calibration parameter ‘Left_Turn_AngularLimit’ in the ‘Requirement1’ subsystem and the parameter ‘LeftTurn_RoadSign’ in the ‘Requirement2’ subsystem. The output signal ‘Vehicle_Direction_Indicator’ will have the boolean value [0 1 0] to indicate Left Turn between 4 and 5 seconds.
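The five test intervals above can be checked offline with a small script that mirrors the Requirement1/Requirement2 logic (an illustrative sketch with helper functions re-stated inline; it complements, but does not replace, the MIL run itself):

```python
# Hedged sketch of the MIL expected results: replays the five 1-second test
# intervals from this report and checks the expected Vehicle_Direction_Indicator.
def vehicle_turn_status(yaw):
    """Mirrors the Requirement1 switch cascade."""
    for cal in (30, -120, 0):  # Right, Left, Straight angular calibrations
        if yaw == cal:
            return cal
    return 0

def vehicle_direction_indicator(status, camera):
    """Mirrors the Requirement2 comparisons ANDed with the camera input."""
    return [status == 30 and camera,    # Right_Turn
            status == -120 and camera,  # Left_Turn
            status == 0 and camera]     # Straight_Drive

# (start_s, end_s, SteeringWheel_YawDegreeInput, CameraInput_RoadSign, expected)
test_vectors = [
    (0, 1,  30,  True,  [True, False, False]),   # Right Turn
    (1, 2, -120, True,  [False, True, False]),   # Left Turn
    (2, 3,  0,   True,  [False, False, True]),   # Straight Drive
    (3, 4,  30,  False, [False, False, False]),  # no road sign captured
    (4, 5, -120, True,  [False, True, False]),   # Left Turn again
]

for t0, t1, yaw, camera, expected in test_vectors:
    out = vehicle_direction_indicator(vehicle_turn_status(yaw), camera)
    assert out == expected, f"{t0}-{t1} s: got {out}, expected {expected}"
```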
Conclusion :-
We have successfully modelled the Vehicle Direction Determination logic, and we were also able to create the MIL (Model-in-Loop) test harness. Further, we tagged the subsystems to their requirements and created the SLDD file.