News  

Microsoft Defender failed to improve in the latest round of the AV-Comparatives rankings

Microsoft Defender has been doing quite well in the AV-TEST rankings in recent years, although it did drop in the latest assessment. And despite some positives in AV-Comparatives’ latest May 2022 real-world protection test, Microsoft Defender shows no signs of recovering from its slide in these assessments.

In last year’s real-world protection test, Defender failed to receive any score due to a bug. Testers later found that the offline detection rate of Microsoft’s product was quite low and that it consumed a considerable amount of system resources.

The latest May 2022 real-world protection test results show that Microsoft Defender achieved the best result in the False Positives test category: along with ESET, Defender recorded zero false positives (FPs). This may come as a surprise, given recent reports of Defender raising false alarms.

For each test case, the evaluation process records any changes the executed malware makes to the test machine. After a sample is executed (if it has not already been blocked), the testing process waits several minutes for malicious behavior to appear, which also gives behavior blockers time to react and undo the malware’s actions. If the malware is not detected and the system is indeed infected/compromised, the case is marked “System Compromised”. Otherwise, the product is considered to have protected the test system, unless user interaction was required; in that case, if the worst possible user decision would leave the system compromised, the case is rated “user-dependent”.

When the product under test asks the user to make a decision, the testers always choose the option that lets the program run (e.g. “Allow”). If the program is still blocked despite this, the product is considered to have protected the system.
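To make that rating logic concrete, here is a minimal Python sketch of the per-case verdict just described; the Verdict enum, the rate_test_case function, and its parameters are hypothetical illustrations, not AV-Comparatives’ actual tooling:

```python
from enum import Enum

class Verdict(Enum):
    PROTECTED = "Protected"              # sample blocked, system stays clean
    USER_DEPENDENT = "User dependent"    # outcome hinges on the user's choice
    COMPROMISED = "System compromised"   # malware ran and infected the system

def rate_test_case(blocked: bool, asked_user: bool,
                   compromised_after_allow: bool,
                   system_infected: bool) -> Verdict:
    """Hypothetical sketch of the per-test-case rating described above."""
    if blocked:
        # Blocked outright, or still blocked after the tester chose "Allow".
        return Verdict.PROTECTED
    if asked_user:
        # Testers always pick the most permissive option; if that worst-case
        # user decision leads to infection, the case is rated user-dependent.
        return (Verdict.USER_DEPENDENT if compromised_after_allow
                else Verdict.PROTECTED)
    # No detection: compromised only if the system was actually infected.
    return Verdict.COMPROMISED if system_infected else Verdict.PROTECTED
```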

Each false positive that depends on a user decision counts as 0.5 rather than a full point, and the resulting totals appear in the FP score column of the results table. The worst performers here were Malwarebytes and Trend Micro, each with more than 40 cases.
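As a quick illustration of that rule (a sketch under the stated 0.5 weighting, not AV-Comparatives’ own formula), the FP column could be tallied like this:

```python
def fp_score(full_fps: int, user_dependent_fps: int) -> float:
    # A user-dependent false positive counts half as much as a regular one.
    return full_fps + 0.5 * user_dependent_fps

# e.g. 40 regular FPs plus 2 user-dependent ones -> 41.0
print(fp_score(40, 2))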

The assessment covered a total of 725 test cases; Defender was compromised 7 times, giving it an overall protection rate of 99%.
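Those numbers are consistent: assuming Defender’s remaining 718 cases were blocked outright (no user-dependent cases), the protection rate works out as follows:

```python
total_cases = 725
compromised = 7

# Assumes all non-compromised cases were fully blocked; user-dependent
# cases, had there been any, would count half per the rules above.
protection_rate = (total_cases - compromised) / total_cases
print(f"{protection_rate:.1%}")  # -> 99.0%
```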

TotalAV performed worst in the test, with 15 compromised cases. Interestingly, TotalAV is said to use the Avira engine, yet it fared poorly while Avira itself ranked among the best anti-malware products in the test.

AV-Comparatives also handed out final protection awards based on how these products performed.

Despite Defender’s comparatively low detection rate, the product won the AV-Comparatives ADVANCED award thanks to its excellent score in the False Positives category. For the same reason, other products such as Malwarebytes, Norton, and Trend Micro suffered considerable setbacks in the final award rankings.