The question of security is one that must continually be asked. Why?
According to the Gartner Predicts 2023 study, Cybersecurity Industry Focuses on the Human Deal, “By 2025, lack of talent or human failure will be responsible for over half of significant cyber incidents.”
The landscape is constantly changing: our adversaries are continually evolving and probing for vulnerabilities, requiring that we remain vigilant, nimble, and secure. It’s not only bad actors and their machinations we must protect against, but also the malevolent tools they deploy to wreak havoc. (These include bots: automated software applications that perform repetitive tasks over a network.)
RubyShield, RubyLaw’s security-focused hosting offering, provides the configuration, management, maintenance, and protection of web and database servers to ensure that your website, and other RubyLaw-powered solutions, remain secure.
But what do we do to keep your properties safe?
One way is through 24/7/365 monitoring from multiple geographic regions. This allows us not only to respond quickly to potential issues or threats, but also to anticipate them. We also continually review and evolve our hosting configuration to optimize access for all users, something that clients on the latest versions of RubyLaw can truly appreciate.
Why is this challenging?
Above and beyond the volume of activity and the varying degrees of danger is the fact that not all traffic to your digital properties is malevolent. Setting aside human visitors (both good and bad actors), much of a site’s automated traffic is beneficial and productive, including bots that crawl, index, and scan. Some of these are building out search indexes, helping human visitors find your site sooner.
RubyShield makes this process easier by helping to differentiate between “the Good” and “the Bad.”
What happens when issues or threats occur?
The first thing we do is attempt to determine the source. Once that is known, we can block it via the IP address or network. In cases where the source is clear and the attack is significant, we will notify client contacts. (We don’t reach out every time this happens, mostly because the vast majority of attempts are low-level and do not merit distracting clients with notifications.)
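As an illustration only (RubyShield’s actual tooling and configuration may differ), blocking a single address or an entire network at the web-server layer can be as simple as a pair of deny rules; the addresses below are placeholders drawn from the reserved documentation ranges:

```nginx
location / {
    deny 203.0.113.45;       # block a single offending IP address
    deny 198.51.100.0/24;    # block an entire network range
    allow all;               # everyone else passes through
}
```

Blocking at the network level (a firewall or upstream provider) works the same way in principle: the traffic is dropped before it ever reaches the application.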
However, if an issue occurs multiple times, we will attempt to determine the patterns and address them with additional firewall configuration. (A firewall is a network security device that monitors incoming and outgoing network traffic and decides whether to allow or block specific traffic based on a defined set of security rules.) In extreme cases, we can block all non-verified bots and deploy rate-limiting rules that protect sites from any attacker who hits a site more than a defined threshold over a short period of time.
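To make the rate-limiting idea concrete, here is a hypothetical sketch of how such a rule might look in an nginx configuration (the zone name, rate, and burst values are illustrative assumptions, not RubyShield’s actual settings):

```nginx
# In the http context: track request rates per client IP,
# allowing 10 requests per second per address.
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    location / {
        # Permit short bursts of up to 20 extra requests;
        # anything beyond that is rejected immediately.
        limit_req zone=perip burst=20 nodelay;
        limit_req_status 429;   # "Too Many Requests"
    }
}
```

The effect is that legitimate visitors, who rarely exceed a few requests per second, are unaffected, while an attacker hammering the site is cut off after the threshold is crossed.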
One point of interest: it’s actually rare for the same attacker to target multiple client sites. This speaks to the required vigilance, as well as to the sheer volume of bad actors. If we do identify such a pattern, however, we can manage it at the RubyLaw level and block the attack across all client sites.
What do we recommend?
We have several recommendations, each depending on the client’s security requirements and preferences. One is initiating a crawl delay, which communicates to crawlers that they should slow down and avoid overloading a web server. Another is locking down RubyLaw access. We can also ensure that key firm locations and third parties are on our allowlist, particularly when the latter perform scans on your site.
It’s important to note: allowlists are client-specific, because each client has its own security posture, traffic patterns, and concerns.
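A crawl delay is typically declared in a site’s robots.txt file. As a simple illustration (the 10-second value is just an example), it might look like this:

```
# robots.txt at the site root
User-agent: *
# Ask crawlers to wait 10 seconds between requests
Crawl-delay: 10
```

One caveat worth knowing: Crawl-delay is a widely used but non-standard directive; some major crawlers, including Googlebot, ignore it, which is one reason server-side protections like those described above remain important.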
If you would like to learn more about RubyShield, cybersecurity, and bots, please reach out to a RubyLaw representative. You can also attend an upcoming session of RubyLaw Live (see below).
RubyLaw Live, our monthly Q&A and expert conversation, is happening this Thursday, July 20 at 4pm ET. Our special topic will focus on RubyShield. We’ll share typical use cases and recent developments, and explain how bots factor in.
This event is open to RubyLaw clients only.