UK drivers cover an estimated 216m miles a year without an MOT


British motorists drive an estimated 216 million miles a year in vehicles that do not have a valid MOT, according to new research.


A study of more than 2,000 drivers by company car insurance firm Direct Line for Business found that almost a fifth had “accidentally” driven their car for at least a week after the MOT certificate had expired in the past five years. Even more worryingly, eight percent confessed to having driven without a valid MOT certificate for more than six months.


Of those who accidentally drove without a valid MOT for a week, almost half (45 percent) said they only drove once, but eight percent admitted to driving five or more times. Assuming the average motorist drives 21.6 miles every day, Direct Line says this means Brits could have covered as many as 216 million miles each year without a valid MOT certificate.
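A hedged back-of-envelope check of that headline figure (the 21.6 miles-per-day average is from the study; the driver-weeks count is a derived illustration, not a number Direct Line reports):

```python
# Back-of-envelope check of Direct Line's estimate, using the study's
# assumption that the average motorist drives 21.6 miles per day.
AVG_MILES_PER_DAY = 21.6

miles_per_driver_week = AVG_MILES_PER_DAY * 7  # 151.2 miles in an un-MOT'd week

# How many driver-weeks without a valid MOT would it take to reach 216M miles?
total_miles = 216_000_000
driver_weeks = total_miles / miles_per_driver_week  # roughly 1.4 million driver-weeks
```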

MOT certificates are a legal requirement for all cars used on the public road, unless they are electric goods vehicles registered before March 1, 2015, tractors or some classic cars first registered more than 40 years ago. Driving a vehicle that needs an MOT but does not have one is punishable by a fine of up to £1,000.


The MOT test involves checking over key vehicle components and ensuring the car is safe to be used on the road. Any faults will fall into one of four categories, ranging from ‘dangerous’ to ‘advisory’, via ‘major’ and ‘minor’. Vehicles with dangerous or major faults will fail, while minor and advisory faults will be noted down and should be monitored and fixed if necessary.
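The four-tier outcome logic can be sketched as a small helper (a simplification of my own; only the category names come from the test scheme described above):

```python
def mot_outcome(fault_category):
    """Map an MOT fault category to its test outcome (simplified sketch)."""
    if fault_category in ("dangerous", "major"):
        # Dangerous and major faults mean the vehicle fails the test.
        return "fail"
    if fault_category in ("minor", "advisory"):
        # Minor and advisory faults are noted; the vehicle still passes.
        return "pass - defect noted; monitor and fix if necessary"
    raise ValueError(f"unknown fault category: {fault_category}")
```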

According to Direct Line for Business’ study, men are more likely than women to have driven without a valid MOT, with 20 percent of men admitting to having forgotten the test compared with 15 percent of women. Meanwhile, those aged 18-34 are much more likely to forget than those aged 35-54 or over 55, of whom just nine percent admitted to having driven without an MOT.


Regionally, a third (32 percent) of Londoners admitted to driving without a valid MOT for a week, while 25 percent of drivers in the North East confessed to the same offence. Rounding out the top four were the North West (20 percent) and Yorkshire (19 percent).

Matt Boatwright, head of Direct Line for Business, said: “Keeping a vehicle roadworthy is a legal requirement and essential from a safety perspective for both the person driving and others on the road. We understand that having to turn down work because your van is in the garage can be frustrating, however not having a valid MOT could result in a hefty fine and in some cases lead to you losing your licence.”


What Happens If Your Uber Ride Crashes in LA?


Even with the recent disputes over pay, there are still plenty of Uber drivers waiting to pick up passengers in LA. The ride-sharing provider is popular in the city. The bad news is that more Uber drivers on the road naturally means an increase in accidents involving Uber passengers.

So, what should you do if you are involved in an Uber accident, and what legal help might you need? The good news is that you should be covered by the insurance of one or other of the drivers involved in the accident, or by the additional insurance that Uber provides. However, it’s not quite that simple.

The insurance situation with Uber drivers

All Uber drivers are required to carry liability insurance. Uber also provides further insurance of its own, which covers the period from when an Uber driver accepts a request from a passenger until the passenger exits the vehicle at the end of the journey.

The policy that Uber holds allows for increased coverage in cases where a third party is injured and the Uber driver is at fault. It also provides cover if another driver is at fault, a third party is injured, and the other driver is either uninsured or does not have sufficient coverage. As you can see, you should be covered for injury if you are involved in an Uber accident as a passenger.

But what happens if liability is a gray area? This can happen when both drivers are at fault. There may be a reluctance to accept liability and pay out fully on claims.

Seeking legal help when there is an issue

If you have been caught up in an accident as an Uber passenger, and the situation becomes complicated, it’s worth seeking legal help. This is because you want to make sure that you get the compensation that you are entitled to.

It can be especially useful to do this when there is more than one passenger in the vehicle. This is because when several different claims are made the situation is complicated further. You should still certainly be entitled to compensation but getting legal help can make the process easier for you. It can also give you peace of mind because you feel as though someone is offering you the support that you need.

In summary

Hiring an Uber in LA is one of the cheapest and most popular ways of getting around if you do not want to drive yourself or use public transport.

However, if you are a passenger in an Uber vehicle, there is always a chance that you could be involved in an accident. If this happens, you should of course make sure that your health is protected first, calling 911 and getting assistance if you are injured. You should also be sure to make a claim for compensation. If the situation is complex, or you simply feel as though you need help, you should consider seeking legal advice.

Jeremy Biberdorf



Uber driver involved crash warns drivers of insurance loophole


NASHVILLE, Tenn. (WTVF) — An Uber driver involved in an accident while on the clock is being forced to pay thousands of dollars to fix her car, and she said the reason is a loophole other rideshare drivers need to know about.

Jennifer Reed has been driving for Uber for about a year to make extra money while she pursues her dream of becoming a Pilates instructor.

Last week, she was waiting to pick up a rider, when she rear-ended another car on I-40 near the Nashville International Airport. The result was a few bumps and bruises, and damage to the front and side of her car. Her airbag also deployed, and her seat belt is now broken.

She reported the accident to her personal insurance company only to learn she wasn’t covered.

“They said as soon as I turn on my app, my personal insurance turns off because my car is considered a business,” said Reed.

On Uber’s website, the company says it provides insurance to every driver. Reed reported her accident through the Uber app, only to find out the company’s insurance wouldn’t cover her either.

“Their reason was I didn’t have anyone in the car,” said Reed. “The whole reason I was in the car was to drive for Uber. What is the point of having insurance if they don’t protect you?”

Reed said she has struggled to get any clear answers from Uber, and she fears she isn’t the only rideshare driver to be a victim of this type of situation.

“Other Uber and Lyft drivers need to know there is a gray area that you can fall into, and this can set you back thousands of dollars,” said Reed.

Uber states its insurance policy for drivers on its website. It says if a driver is in an accident and is not using the Uber app, their personal insurance will apply. However, if a driver is available or waiting for a ride request and is involved in an accident where they are at fault, Uber offers third party liability insurance to cover bodily injury and property damage. This only covers damage to another person or another vehicle. If the driver is en route to pick up riders or is on a trip, Uber will cover third party liability, and damage to the driver’s vehicle subject to a $1,000 deductible.
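The three coverage phases described above can be summarized in a small sketch of my own (the phase names and wording are illustrative, not Uber's actual terminology or API):

```python
# Simplified sketch of the three coverage phases described above.
# Phase names are invented for illustration; the coverage descriptions
# paraphrase the policy summary in the article.
COVERAGE_BY_PHASE = {
    "app_off": "driver's personal insurance applies",
    "waiting_for_request": "third-party liability only (others' injury and "
                           "property damage; not the driver's own vehicle)",
    "en_route_or_on_trip": "third-party liability plus damage to the driver's "
                           "own vehicle, subject to a $1,000 deductible",
}

def coverage_for(phase):
    """Return the coverage that applies in a given phase of a rideshare shift."""
    return COVERAGE_BY_PHASE[phase]
```

On this reading, Reed's accident fell into the "waiting_for_request" phase, which is exactly the gap she describes: Uber's cover applied only to third parties, and her personal policy had switched off.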

Reed was forced to pick up another job to help make ends meet, and said she feels both insurance companies have taken advantage of her. She added Uber needs to clarify its policy so drivers fully understand what is covered.

“It’s unfair. It’s criminal and they are shirking their responsibilities,” said Reed.



Uber warns not to impede innovation or over-regulate self-driving vehicles as it hints at moving into …


Uber has issued a warning to the UK not to impede innovation or create artificial barriers as it hints that it may move to bring autonomous vehicles into the UK market.

Uber told the Law Commission that there may be a “natural tension” between innovation and safety.

The company recommended that regulators steer away from making firm decisions until evidence is available to support them.

Uber’s self-driving program is no stranger to controversy. In March of last year, it temporarily ceased trialling its autonomous vehicles after one of the cars killed a woman in Tempe, Arizona.

Uber, which is believed to be working towards what is known in the tech industry as “level four” autonomy, was the only US company to respond to the Law Commission’s consultation on how self-driving vehicles could be regulated.

Level four autonomy means that cars have the capability of piloting themselves on certain types of road, whereas level five autonomy is an indicator that an autonomous vehicle can drive itself in all road situations.

The Law Commission is looking for responses to its proposals for the regulation of self-driving vehicles.

Those proposals could include criminal proceedings against anyone who is negligent while in charge of an autonomous vehicle. There could also be a review of the corporate manslaughter legislation in relation to death or serious injury caused by any given company’s autonomous vehicle.

Other aspects of the proposals include asking whether a vehicle should be able to mount the pavement to avoid an accident or to allow emergency vehicles to get past, and whether the vehicle should be able to creep through heavy pedestrian traffic.

Another question raised in the proposals is whether there should be a back-up driver, or remote access for a third party who can control the vehicle externally when necessary.

The Telegraph reported that the Law Commission was looking at a further consultation on legislation for autonomous taxi fleets.

Transport for London said: “If the ‘user-in-charge’ were to be based outside the vehicle and responsible for many vehicles remotely, this could present concerns if they needed to resume control of multiple vehicles.”

Image Source: Wikimedia Commons

Image Author: Diablanco


Tesla Autopilot Safety Stats Said Imbued With Statistical Fallacies, Interpret Cautiously


For all autonomous car makers, it’s important to scrutinize their self-reported roadway-safety record, including Tesla’s.

Christopher Goodney/Bloomberg

As I’ve mentioned in my column posts, Tesla and Elon Musk’s vision for the realization of semi-autonomous and fully autonomous cars is commendable and has undoubtedly helped spur progress toward self-driving cars. But it is constructive to consider the nature of the statistics provided to the public by both Tesla and Musk when it comes to asserting the miles-safety triumphs they purport to have already accomplished.

Why try to unpack claims regarding miles-safety stats?

For both those within the autonomous car industry and those outside it, there needs to be a realistic understanding of what the existing semi-autonomous and autonomous car capabilities are, and so any time that any automaker or tech firm reports their stats, it is worthwhile to examine closely the provided numbers.

Let’s be clear about my key axioms on this matter:

• No firm should be immune to such scrutiny.

• All autonomous car makers should be willing to share their stats and do so in a manner that offers a verifiable and veracious indication of their latest and ongoing status (referred to as “safety data transparency”).

• And please be aware that I say this about any and all autonomous car developers; I am not singling out Tesla per se, and am preparing a series of columns offering a similar analysis of other driverless car makers.

Let’s consider several important background aspects before we dig into the numbers.

Dangers Of Spreading Fake News About Autonomous Cars

Some point to the handful of states that require disengagements reporting as a seeming showcase depicting the autonomous car tryouts taking place on our public roadways (a “disengagement” is generally counted as an instance of a human back-up driver having to take over control of an autonomous car during public roadway tryouts, though there is a lot of wiggle room in the definition).
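The headline metric those reports yield is typically miles per disengagement, which, definitional wiggle room aside, amounts to a simple ratio:

```python
def miles_per_disengagement(total_miles, disengagements):
    """Headline metric derived from state disengagement reports.

    Caveat: what counts as a 'disengagement' varies between filers,
    which is precisely why this number is hard to compare across firms.
    """
    if disengagements == 0:
        # No disengagements reported: the ratio is unbounded.
        return float("inf")
    return total_miles / disengagements
```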

Those disengagement reports are not particularly revealing, nor a viable means to grasp and portray the true status of the matter.

Indeed, when the disengagement numbers are released by various governmental entities, there is often a flurry of news accounts about the statistics, a kind of unruly rush to judgment to meet the voracious news-cycle demand. Yet if the media promulgating those numbers took a moment to carefully dissect the figures, they would do a much better job of informing the public.

We need to try and curtail the rise of fake news about autonomous cars, which I’ve been decrying for several years now as a widening malady.

Unfortunately, at times the media merely passes along the reported stats and inoculates the numbers with an impression that the counts must be forthright and revealing, yet that often is not the case. Do not, though, mistake my remarks as suggesting that we should abandon such reporting. I am simply asserting that, regrettably, the metrics used and the way they are reported are weak and somewhat vapid, and I have exhorted that more robust metrics be put in place and offered suggestions to that end.

Tesla and Musk have already been reproached over some of their prior roadway safety claims, perhaps most notably when the National Highway Traffic Safety Administration (NHTSA) released data, obtained only after a federal lawsuit, showing that the widely touted 40% reduction in crash rates claimed for Tesla’s Autosteer feature was not quite up to par. Analysts characterized the claims as “implausible” and not supported by the data; others used the blunter word “bogus” after reviewing it (and some say there is “no reason” to trust the numbers).

Tesla Quarterly Miles-Safety Data Reports

Starting in October of last year, Tesla began reporting what it describes as quarterly safety data on its publicly accessible website. When you hear “quarterly safety data,” you might be inclined to assume it is a slew of detailed data providing a richness of information, a plethora of safety data for public consumption.

Not quite.

Essentially, it is a single paragraph reporting two numbers: the purported number of miles driven before a car accident or crash occurs, shown once for the case when Autopilot was engaged and once for when a Tesla was being driven without Autopilot engaged.

That’s it.

For those wishing to do any kind of analysis of safety-related aspects of Tesla cars, the provision of merely two numbers could be characterized as a paucity of data (some might use more stringent wording).

In brief:

• There is no indication of the means by which Tesla arrived at the two numbers (whether raw or transformed first).

• There is no apparent means for anyone to verify the veracity of the two numbers.

• There is no underlying data provided that could be used in the furtherance of interpreting the numbers.

• There are no numbers beyond those two, yet there are many other stats Tesla could provide, given that it is evidently able to produce these.

• There would not seem to be any additional “extra” effort or cost that Tesla might incur to provide additional data or additional stats since presumably they are already doing so for their own internal purposes.

• There would not seem to be any qualm or concern about somehow revealing company proprietary secrets by releasing more stats or underlying data, since it can readily be done without divulging any technology or IP (intellectual property) that the firm might wish to keep private.

• Etc.

Be that as it may, let’s see what those two numbers are and how Tesla has to date made claims about them as an indicator of its cars’ miles-safety track record.

Reported Tesla Numbers And What They Mean

Here are the reported numbers (for clarity, these aren’t my numbers, these are the Tesla reported numbers) and for which the sole metric is stated as being one car accident or crash-like event per miles driven:

Q1 2019: 2.87M miles driven-to-crash incident in which Autopilot was engaged

Q1 2019: 1.76M miles driven-to-crash incident without Autopilot engaged

Q4 2018: 2.91M miles driven-to-crash incident in which Autopilot was engaged

Q4 2018: 1.58M miles driven-to-crash incident without Autopilot engaged

Q3 2018: 3.34M miles driven-to-crash incident in which Autopilot was engaged

Q3 2018: 1.92M miles driven-to-crash incident without Autopilot engaged

Another way to group the numbers would be by showing them over time and whether in the Autopilot engaged or not engaged case:

Q1 2019: 2.87M miles driven-to-crash incident in which Autopilot was engaged

Q4 2018: 2.91M miles driven-to-crash incident in which Autopilot was engaged

Q3 2018: 3.34M miles driven-to-crash incident in which Autopilot was engaged

Q1 2019: 1.76M miles driven-to-crash incident without Autopilot engaged

Q4 2018: 1.58M miles driven-to-crash incident without Autopilot engaged

Q3 2018: 1.92M miles driven-to-crash incident without Autopilot engaged

So, what do these stats mean?

First, keep in mind that the higher the number, the better: it implies that a car went more miles before getting into an accident or crash. Theoretically, you would never want any car crashes at all, in which case we might aim for a value of infinity, which I say partly in jest, but the point is that the goal is to maximize the number of miles driven before a crash occurs.

You might have noticed that the numbers when Autopilot was not engaged are lower than the numbers when it was engaged. This implies that the Autopilot-engaged group goes further before incurring a crash, or, equivalently, that the group without Autopilot engaged gets into a crash after a shorter distance traveled.

As an aside, one small but worthwhile point involves the “without engaged” Autopilot group. Presumably, this includes Teslas that have Autopilot but for which it was not engaged for some portion of traveling, those miles counting as not-engaged (you might make a single 60-mile journey, using Autopilot for 10 miles and not for the other 50, thereby racking up 10 miles in the Autopilot-engaged group and 50 miles in the “without engaged” group). Also, a Tesla without Autopilot available would, by definition, only rack up miles in the “without engaged” group.

For ease of comparison between the two groups: the average for the Autopilot-engaged group is 3.04 million miles, and for the group without Autopilot engaged, 1.75 million, by simple math. You could suggest that the Autopilot-engaged group therefore goes about twice the distance before a crash occurs (the ratio is actually about 1.7x, rounded up to 2).
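The simple math behind those figures, using only the numbers reported above:

```python
# Tesla's reported miles-to-crash figures, in millions of miles.
autopilot = [3.34, 2.91, 2.87]      # Q3 2018, Q4 2018, Q1 2019
no_autopilot = [1.92, 1.58, 1.76]   # same quarters

avg_ap = sum(autopilot) / len(autopilot)           # 3.04M miles on average
avg_no_ap = sum(no_autopilot) / len(no_autopilot)  # ~1.75M miles on average
ratio = avg_ap / avg_no_ap                         # ~1.7x, often rounded to 2x
```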

If you look at the numbers over time, you’ll notice that the Autopilot engaged group did “better” in Q3 2018, but then appeared to drop by about 13% in Q4 2018 (in a sense, worsening), and then further dropping another 1-2% for Q1 2019.

You’ll likely also notice that the without-Autopilot group dropped (worsened) from Q3 2018 to Q4 2018 by about 18%, and then somewhat rebounded for Q1 2019 (improving from Q4 2018 but still below Q3 2018).
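The quarter-over-quarter changes just described work out as follows:

```python
def pct_change(earlier, later):
    """Percentage change from one quarter's figure to the next."""
    return (later - earlier) / earlier * 100

# Autopilot engaged (millions of miles per crash)
drop_q3_to_q4 = pct_change(3.34, 2.91)   # about -13%
drop_q4_to_q1 = pct_change(2.91, 2.87)   # about -1.4%

# Without Autopilot engaged
drop_no_ap = pct_change(1.92, 1.58)      # about -18%
rebound_no_ap = pct_change(1.58, 1.76)   # about +11%
```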

Some have made a bit of a ruckus about the fluctuations in the stats, but I’m not going to entertain herein that matter, partially due to trying to keep this discussion to a manageable size (though I do have some thoughts about it). Plus, I think there are more pronounced aspects to be considered, as next discussed.

Possible Claims Made Based On The Tesla Reported Numbers

Within the paragraphs posted by Tesla, the company indicates that the NHTSA nationwide data show a car crash every 436K miles nationally (in Q1 2019, and likewise in Q4 2018) and every 492K miles in Q3 2018.

As such, this suggests that the Autopilot-engaged group’s average of 3.04M miles per crash is presumably much greater (better) than the national average for cars overall (about 7x), and that the not-engaged average of 1.75M miles is also presumably greater (better) than the national average, though less so (about 4x versus 7x). The implication is that Tesla cars overall go longer distances before incurring a crash than the national average; as mentioned earlier, a larger miles-driven-before-crash figure is considered desirable.
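Comparing those averages to the NHTSA national figure yields the roughly 7x and 4x multiples just cited:

```python
national_miles_per_crash = 436_000   # NHTSA figure cited for Q1 2019 / Q4 2018

tesla_ap_avg = 3_040_000             # Autopilot-engaged average (miles per crash)
tesla_no_ap_avg = 1_750_000          # without-Autopilot average (miles per crash)

multiple_ap = tesla_ap_avg / national_miles_per_crash       # roughly 7x
multiple_no_ap = tesla_no_ap_avg / national_miles_per_crash  # roughly 4x
```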

In any case, here’s what seems to be the preponderance of the overall claims sometimes made about these particular stats:

Overall Claim #1: On a comparative basis, the use of Autopilot versus not using Autopilot appears to indicate that the driving task is being undertaken on a safer basis when Autopilot is engaged (given the 1.7x, or roughly 2x, greater distance traveled prior to a crash).

Overall Claim #2: On a comparative basis, the Tesla cars are presumably being driven on a safer basis overall since the Autopilot engaged group miles-to-crash is larger than the national average miles-to-crash (by about 7x), and so is the not-engaged Autopilot group higher than the national average of miles-safety driven (by about 4x).

There are pundits in the autonomous car industry who like to latch onto the first claim, as it suggests that autonomous cars are safer than non-autonomous cars, but this is quite a misleading and false interpretation, I assert. Keep in mind that Tesla cars are not yet fully autonomous; they are at Level 2, rather than a truly autonomous Level 5. Thus this is only about semi-autonomous cars, ones that involve co-sharing of the driving task by human and machine, rather than the machine or AI being the sole driver of the car.

Given that caveat, some pundits will then back down to saying that it at least “shows” or “demonstrates” that using semi-autonomous capabilities leads to safer driving (since, on a comparative basis, the Autopilot-engaged group seems to go a greater distance before incurring a crash than the not-engaged group).

There are a number of potential statistical fallacies involved in these kinds of proclamations, and it would be instructive to reveal those potential fallacies.

Statistical Fallacies When Trying To Interpret The Safety-Miles Stats

Statistics are an important element in AI areas such as Machine Learning, Deep Learning, and probabilistic reasoning. One of the books I often recommend is the somewhat eye-opening book “How To Lie with Statistics” that was written by Darrell Huff and has stood the test of time in terms of laying out some of the most commonly misunderstood aspects about interpreting statistics.

Studies show over and again that people often fall into numerous mental traps when interpreting statistical data and are vulnerable to statistical fallacies.

Here’s an example. I ask five people to wear regular shoes to play a basketball game against five other people that are wearing specialized basketball-playing sports shoes. After playing a fierce basketball game, the team wearing specialized sports shoes wins. What might you conclude? I’m sure that most would believe that the specialized sports shoes made the difference in terms of why the one team beat the other team, having been given an added edge attributed to those sporty shoes.

But suppose I then told you that the team wearing regular shoes were only allowed to make shots from beyond the free-throw line, while the other team could make shots anywhere and including right next to the basket. Whoa! You might now change your mind and say that it wasn’t particularly the shoes that made the difference.

For Claim #1, namely that the use of Autopilot appears to have led to greater safety since the miles-safety distance stat is larger than for the group without Autopilot engaged, your likely first impression is that Autopilot usage must have led to safer driving.

But, hold on a moment, and let’s think about that, maybe there’s a statistical fallacy lurking in that thinking.

Please be aware that in the autonomous car industry, it is well-known that not all miles are the same, meaning that if you put an autonomous car onto a wide-open highway with minimal traffic, it is a lot easier for the AI than if you put that same autonomous car into a dense city setting that is chock full of traffic, pesky pedestrians, and other “car accident” inducing elements.

Here’s the question for you: when Autopilot is engaged, what is the nature of the roadway driving involved? Is it more likely that the miles racked up on Autopilot are on highways rather than in other settings? Does this potentially suggest that we might be comparing apples and oranges? Similar to the story about the basketball teams, our attention might be focused on the shoes (i.e., Autopilot) when it could be that the type of miles being driven with Autopilot engaged are simply less likely to incur car crashes.

If Tesla were to provide the underlying data, anonymized for privacy protection, it would be possible to analyze what types of driving areas are involved when Autopilot is engaged, and then to find a comparable subset of the not-engaged data to see whether differences remain (a more suitable basis for comparison, though there might still be additional confounding or underlying factors).
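If that underlying data were ever released, the like-for-like comparison suggested above could be sketched roughly as follows (the record schema here is entirely hypothetical, since Tesla publishes no such data):

```python
from collections import defaultdict

def miles_per_crash_by_road_type(records):
    """Group hypothetical trip records by road type, then compute
    miles-per-crash separately for Autopilot-engaged and not-engaged miles.

    Each record is a (road_type, autopilot_engaged, miles, crashes) tuple;
    this is an invented schema for illustration only.
    """
    totals = defaultdict(lambda: [0.0, 0])  # (road_type, engaged) -> [miles, crashes]
    for road_type, engaged, miles, crashes in records:
        totals[(road_type, engaged)][0] += miles
        totals[(road_type, engaged)][1] += crashes
    return {
        key: (miles / crashes if crashes else float("inf"))
        for key, (miles, crashes) in totals.items()
    }
```

Comparing the ("highway", engaged) figure against ("highway", not engaged) would be a fairer test of Autopilot's effect than the pooled numbers, since it holds the driving environment roughly constant.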

Overall, it would be best to treat Claim #1 cautiously as a “we don’t know”: we cannot tell whether it portends any significant difference due to the use of Autopilot, and should not leap to a brazen or unsubstantiated conclusion.

For Claim #2, there is a somewhat similar potential statistical fallacy involved.

Returning to the basketball teams, I bring forth a third team, wearing just regular shoes. I make the game rules equal, meaning you can shoot from anywhere on the court you prefer. The team with the specialized shoes handily beats this new team, and furthermore, the team with only regular shoes also beats this third team. Why? It turns out the initial two teams were composed of college basketball players, while the third team was a hodgepodge of people who just happened to be around when it was put together.

In Claim #2, it appears that Tesla drivers with Autopilot engaged fare better than the nationwide crash stat, and so too do Tesla drivers without Autopilot engaged. Does this mean that Tesla cars are somehow safer than all other cars? We have no reasonable way to draw that conclusion.

Keep in mind that Teslas tend to be more expensive cars to date, aimed toward the luxury segment, and the buyers are demographically likely a niche, differing from everyday drivers. Furthermore, Tesla buyers are generally considered “early adopters,” yet another niche or slice of everyday drivers; the masses or “late adopters” have (presumably) not yet become Tesla drivers.

It could be that the type of driver that is driving Tesla cars is unlike the overall mix of all drivers that is reflected in the national miles-to-crash statistic. In that sense, it could be that the driver is the difference, rather than the car itself.

Or, maybe Tesla cars are primarily being driven in areas of the country that are not representative of the driving encompassed by the national stat, thus, perchance the Tesla cars are being driven in places that tend toward having fewer car crashes or car crashes at longer intervals of driving.

There are lots of other potential intervening elements that make the comparison problematic, which again, if the data were available, would allow for closer analysis to see whether those other factors could be examined.

Plus, there is often a big difference between what can be achieved in a shorter run versus a longer one. The distance traveled annually by all cars in the U.S. is about 3.2 trillion miles, which is what the national stat is based upon, while as far as has been reported to date, Tesla cars may have been driven perhaps 10 billion miles in total, a mere fraction of the overall national driving distance. Running a sprint versus running a marathon can produce quite different results when comparing times and stats.


Correlation does not imply causation; it is a key mantra for anyone versed in statistics and statistical reasoning.

We all tend to assume that any stats we see are based on some kind of random sampling, even though they often are not, and that any intervening variables are being controlled, which they usually are not (such rigorous setups generally happen only in laboratory-based experiments or similarly planned statistical research efforts).

Mark Twain was famous for popularizing the quip: “There are three kinds of lies: lies, damned lies, and statistics.” It is important that any stats provided by any automaker or tech firm about their semi-autonomous or autonomous cars be carefully scrutinized and not inappropriately interpreted or utilized.
