According to findings from a proprietary database scanning tool, one out of every two on-premises databases in the world has at least one vulnerability.
The question is, why is data security so difficult?
This investigation reveals that the present data security strategy is not working. For years, organizations have prioritized and invested in perimeter and endpoint security, assuming that protecting the systems and networks that house data would be enough. However, the scale and global reach of this problem show that such an approach falls short. To truly protect data, organizations must rethink how they safeguard it.
The research surfaced several important trends:
With nearly half of all databases globally (46 percent) containing a vulnerability and the average number of Common Vulnerabilities and Exposures (CVEs) per database standing at 26, it’s clear that businesses are ignoring one of the fundamental tenets of data security: patch and update databases as soon and as frequently as possible.
Measured against NIST's severity ratings, more than half of the vulnerabilities present in databases worldwide are classified as 'High' or 'Critical,' meaning that once a database is infiltrated, attackers can steal or manipulate sensitive data, seize control of the system, or move laterally through the network.
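The 'High' and 'Critical' labels correspond to the qualitative severity bands that the National Vulnerability Database applies to CVSS v3.x base scores. As a quick reference, that mapping can be sketched as a small function (the thresholds follow the CVSS v3.x specification; the function name is our own):

```python
def nvd_severity(base_score: float) -> str:
    """Map a CVSS v3.x base score to its NVD qualitative severity rating."""
    if not 0.0 <= base_score <= 10.0:
        raise ValueError("CVSS base scores range from 0.0 to 10.0")
    if base_score == 0.0:
        return "None"
    if base_score <= 3.9:
        return "Low"
    if base_score <= 6.9:
        return "Medium"
    if base_score <= 8.9:
        return "High"
    return "Critical"  # 9.0 - 10.0
```

Any vulnerability scoring 7.0 or above lands in the 'High' or 'Critical' band, the range the majority of the discovered database CVEs fall into.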
Not only are organizations failing to devote enough effort to patching, but some databases appear to have gone unnoticed altogether: we found CVEs dating back three to four years.
While the global results are disturbing, the regional breakdown is even more so for developed countries such as France, Singapore, and China. Databases in these countries exceed the worldwide average both in the percentage of vulnerable databases and in the average number of vulnerabilities per database. Even in nations with a relatively low share of vulnerable databases, such as Germany (19 percent), the average number of vulnerabilities per database remains high.
It is impossible to protect data without a comprehensive view of every location where data is stored across the company, including rogue databases that have been set up outside the purview of security teams.
Because of the complexity of modern business, data has grown more diffuse than ever, making it critical to automate this discovery process so that nothing is inadvertently missed. Importantly, this should include deploying technologies that detect irregular database activity, as well as solutions that prevent vulnerabilities from being exploited.
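One simple building block of automated discovery is sweeping the network for hosts listening on well-known database ports. The sketch below illustrates the idea under a stated assumption: engines listen on their default ports, so databases moved to non-standard ports (or firewalled) will be missed. The function and dictionary names are our own, not part of any particular product.

```python
import socket

# Common default ports for popular database engines. Assumption: the
# engines are listening on their defaults; non-standard ports are missed.
DB_PORTS = {
    3306: "MySQL",
    5432: "PostgreSQL",
    1433: "SQL Server",
    1521: "Oracle",
    27017: "MongoDB",
    6379: "Redis",
}

def discover_databases(hosts, ports=DB_PORTS, timeout=0.5):
    """Return (host, engine) pairs for each host with an open database port."""
    found = []
    for host in hosts:
        for port, engine in ports.items():
            try:
                # A successful TCP connect suggests a listener on that port.
                with socket.create_connection((host, port), timeout=timeout):
                    found.append((host, engine))
            except OSError:
                continue  # closed, filtered, or unreachable
    return found
```

A production discovery tool would go further, fingerprinting the service banner to confirm the engine and version rather than trusting the port number alone.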
In an ideal world, security teams would have enough time to fix every vulnerability in every database as soon as it was discovered. However, given the deluge of competing duties from across the organization and the constraints on when fixes can be applied, this is becoming increasingly difficult to manage.
Businesses across all industries are pushing ahead with digital transformation programs and migrating data to the cloud. However, these findings show that managing on-premises security is already extremely difficult, even before considering the difficulty of securing data in the cloud. While digital transformation is critical for businesses to remain competitive, they must also have a clear, unified plan for protecting data and all access to it, regardless of where it lives.