Rant: What Are Self-Driving Vehicles, and My Views

We all like learning about new technology, and we hear about multiple companies working on Self-Driving vehicles. What we never hear about is how the technology actually works and what using it involves. In this post, we will explain a little about the technology being used and how the car interacts with other users or objects on the road. If you haven’t already seen Moral Machine, a site developed by MIT, then I do suggest you check it out. I believe its methods are really drastic, because in most of the 13 scenarios presented you have to take a life. The whole reason for Self-Driving vehicles is to prevent accidents and make driving safe for everyone. If you think my judgment is wrong, please let me know in the comments.

How does a Self-Driving vehicle work?
A Self-Driving vehicle is built on several long-established technologies that have, of course, kept developing over the years. Let’s go over what is needed for a Self-Driving vehicle to function correctly.

CPU (Central Processing Unit) is at the top of our list because it is the closest thing the vehicle has to a functioning brain: it processes all of the sensor input and makes the driving decisions.

Proximity Sensors have been around since 1970.
The proximity sensor (a transmitter and sensor pair) works acoustically. A pair is fitted on the rear of the car. The transmitter generates a high-frequency sound signal, and the sensor measures the time it takes for that signal to bounce back from an obstacle. The time shrinks as the car approaches the obstacle, telling the driver when to stop. This sounds a lot like sonar, and we will get into that very soon.
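The math behind that echo measurement is simple enough to sketch. This is an illustrative calculation, not a real sensor API: the function name and the assumed speed of sound in air are my own choices.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 °C

def echo_distance_m(round_trip_time_s: float) -> float:
    """Distance to an obstacle from an ultrasonic echo's round-trip time.

    The ping travels to the obstacle and back, so the one-way
    distance is half of (speed of sound * elapsed time).
    """
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0

# A ping that returns after about 5.83 ms corresponds to roughly 1 metre.
print(round(echo_distance_m(0.00583), 2))  # → 1.0
```

As the car backs toward a wall, the round-trip time keeps shrinking, and the car (or the beeper) reacts once the computed distance drops below a threshold.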

Road Maps have been around since 1160 BC, but didn’t start getting used in vehicles until 1981 (Honda Accord/Vigor). The maps we use today come from two Danish brothers, Lars and Jens Eilstrup Rasmussen, back in 2003; their mapping technology was later acquired by Google and became Google Maps.
Maps tell you the directions to the destination of your choosing. We will go over the pitfalls of this later in the post.

GPS (Global Positioning System) Navigation is also required, but this technology was first introduced in 1993 (Mazda Cosmo).
GPS is made up of satellites, ground stations, and receivers. The satellites continuously broadcast timing signals, while the ground stations keep the satellites’ clocks and orbit data accurate. The receiver listens to several satellites at once, uses the travel time of each signal to work out its distance from each one, and combines those distances to trilaterate your position. Once your location has been identified, it is placed on the map.
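To make trilateration concrete, here is a simplified 2D sketch (real GPS works in 3D and also solves for the receiver’s clock error, which is why it needs at least four satellites). The anchor coordinates and distances below are made-up illustrative values.

```python
import math

def trilaterate_2d(anchors, distances):
    """Locate a point from its distances to three known anchors (2D).

    Subtracting the first circle equation from the other two turns
    the problem into two linear equations in (x, y).
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A receiver at (3, 4), measured from anchors at (0,0), (10,0), (0,10):
anchors = [(0, 0), (10, 0), (0, 10)]
dists = [math.hypot(3, 4), math.hypot(-7, 4), math.hypot(3, -6)]
print(trilaterate_2d(anchors, dists))  # → approximately (3.0, 4.0)
```

Each measured distance defines a circle around its anchor; the receiver sits where all the circles intersect, which is exactly what the linear solve recovers.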

Sonar Grid or radar traces back to the electromagnetic-wave work of James Clerk Maxwell and Heinrich Hertz in the 1880s.
Sonar works together with the CPU, the multiple proximity sensors, and the camera. Using the functions that have already been explained, it gathers all of this information and guides the vehicle around obstacles or toward a target.

Front, Reverse, and Side Camera technology was introduced by Toyota in 1991 (Toyota Soarer sports coupe) and has been developed further since.
This works like the normal camera on any device you use, but as a safety trade-off the wide-angle lens can distort distances, much like your side mirrors. Combined with the other functions listed above, it gives the vehicle a better picture of its surroundings.

SAE automated vehicle classifications:
Level 0: Automated system issues warnings but has no vehicle control.
Level 1 (”hands-on”): Driver and automated system share control over the vehicle. An example would be Adaptive Cruise Control (ACC) where the driver controls steering and the automated system controls speed. Using Parking Assistance, steering is automated while speed is manual. The driver must be ready to retake full control at any time. Lane Keeping Assistance (LKA) Type II is a further example of level 1 self-driving.
Level 2 (”hands-off”): The automated system takes full control of the vehicle (accelerating, braking, and steering). The driver must monitor the driving and be prepared to immediately intervene at any time if the automated system fails to respond properly. The shorthand ”hands off” is not meant to be taken literally. In fact, contact between hand and wheel is often mandatory during SAE 2 driving, to confirm that the driver is ready to intervene.
Level 3 (”eyes off”): The driver can safely turn their attention away from the driving tasks, e.g. the driver can text or watch a movie. The vehicle will handle situations that call for an immediate response, like emergency braking. The driver must still be prepared to intervene within some limited time, specified by the manufacturer when called upon by the vehicle to do so.
Level 4 (”mind off”): As level 3, but no driver attention is ever required for safety, i.e. the driver may safely go to sleep or leave the driver’s seat. Self-driving is supported only in limited areas (geofenced) or under special circumstances, like traffic jams. Outside of these areas or circumstances, the vehicle must be able to safely abort the trip, i.e. park the car, if the driver does not retake control.
Level 5 (”wheel optional”): No human intervention is required. An example would be a robotic taxi.
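The key practical question across these levels is whether the human still has to pay attention. A small lookup table captures the classification above; the nicknames and short descriptions simply mirror this post’s list, and the function name is my own.

```python
# Illustrative mapping of the SAE levels described above.
SAE_LEVELS = {
    0: ("no automation", "warnings only, no vehicle control"),
    1: ("hands-on", "driver and system share control, e.g. ACC"),
    2: ("hands-off", "system steers, brakes, accelerates; driver monitors"),
    3: ("eyes-off", "driver may look away but must intervene when asked"),
    4: ("mind-off", "no driver attention needed within a geofenced area"),
    5: ("wheel-optional", "no human intervention anywhere"),
}

def driver_attention_required(level: int) -> bool:
    """Levels 0-2 demand constant driver attention; levels 3+ relax it."""
    return level <= 2

print(SAE_LEVELS[2][0], driver_attention_required(2))  # → hands-off True
```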

Who can drive a Self-Driving vehicle?
It is not as simple as turning the key or using voice activation to tell the car to drive to your destination. To get behind the wheel of a Self-Driving vehicle, you first need to be evaluated on your current driving habits, including how well you can perform the Self-Driving vehicle’s job without making mistakes. If you pass the test, you will still have to attend a 40-hour class that teaches you how to operate the semi-automated car. I say semi-automated because the car cannot function without human interaction: it will follow a set path and drive itself as long as its many sensors can read the road, but roads are not all built to the same standard, and obstacles come and go.

Maybe you already had a good understanding of Self-Driving vehicles, but the information above should help you understand them even better.


Meltdown and Spectre vulnerabilities — PowerShell Script

Microsoft has released a PowerShell module that lets the average user check whether the chip-vulnerability mitigations are enabled on your system. Microsoft has known about this issue since June 2017, but has only recently started releasing updates for Windows. To make sure that you pass the checks, you will need an updated Windows OS with the January 2018 security updates, plus the BIOS/firmware update for your PC.

I thought it would be best to give the average computer user a quick way to test their system, so I created this simple script. The script first checks whether it is running with Administrator rights and, if not, relaunches itself elevated, which may require you to confirm the User Account Control (UAC) prompt. Then it makes sure the execution policy is set to RemoteSigned, and that the PSRepository called PSGallery is trusted. Once all of that is done, the script runs Get-SpeculationControlSettings and gives you an output like this.

You can find the script explained below or download it from my GitHub page.

The image above is full of red and False checks, but take a good look at the suggested actions. Installing or updating the BIOS/firmware and the January 2018 security update will turn the False readings to True. I am unable to install the BIOS update without the company admin password, but I have installed the January 2018 security update. You can see that image below.


param([switch]$elevated)

### Checks if the session is running as Administrator ###
Function Test_Admin {
    $currentUser = New-Object Security.Principal.WindowsPrincipal([Security.Principal.WindowsIdentity]::GetCurrent())
    $currentUser.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)
}

### Relaunches the script elevated if needed ###
If ((Test_Admin) -eq $false) {
    If ($elevated) {
        # Tried to elevate, did not work, aborting
        Write-Warning "Could not elevate to Administrator. Aborting."
        Exit
    }
    Else {
        # Relaunch this script in an elevated PowerShell (triggers the UAC prompt)
        Start-Process powershell.exe -Verb RunAs -ArgumentList ('-noprofile -noexit -file "{0}" -elevated' -f ($myinvocation.MyCommand.Definition))
        Exit
    }
}

### Runs the Meltdown/Spectre check ###
Function Meltdown_Spectre {
    Set-ExecutionPolicy RemoteSigned -Scope CurrentUser
    Set-PSRepository -Name PSGallery -InstallationPolicy Trusted
    Install-Module -Name SpeculationControl
    Import-Module SpeculationControl
    Get-SpeculationControlSettings
}

Meltdown_Spectre