Cloud security basics: 9 security issues to address as you move to cloud services


The scalability advantages of cloud computing can only be sustained by getting the cloud security basics right. A cloud service provider takes care of the physical security of its data centres, but the organization storing data there remains responsible for its own cloud security configurations. Cloud providers keep adding security features, yet we continue to see data in the cloud compromised by misconfigured settings, which shows how widespread the issue has become since we reported on it last year. We’ve highlighted this and 8 other cloud security basics every cloud user should know:

1) Data Breaches and misconfigured cloud storage:

What can we learn from the recent S3 bucket leaks? Misconfigurations are common, and most can be fixed easily; it’s when they go unfixed that disasters occur. Many incidents have been traced to misconfigured or weak access control lists. Whether you’re managing Amazon S3 buckets, Azure blobs or Google Cloud Storage, your organization must take ownership of these settings to safeguard the sensitive information stored there. By doing so, you make sure the right people have the right credentials, and keep out malicious hackers and other unauthorized users, including third-party vendors.
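As a minimal sketch of what such a check can look like: the dictionary below has the shape returned by boto3’s `s3.get_bucket_acl(Bucket=...)` call, and the two group URIs are the ones S3 uses for “everyone” and “any authenticated AWS user”. Fetching the real ACL with valid credentials is assumed to happen elsewhere; the example data is made up.

```python
# Flag public grants in an S3 bucket ACL. The dict shape matches what
# boto3's s3.get_bucket_acl(Bucket=...) returns.
PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(acl: dict) -> list:
    """Return the permissions granted to 'everyone' groups."""
    findings = []
    for grant in acl.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GRANTEES:
            findings.append(grant["Permission"])
    return findings

# Illustrative ACL with one world-readable grant:
acl = {"Grants": [
    {"Grantee": {"Type": "Group",
                 "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
     "Permission": "READ"},
    {"Grantee": {"Type": "CanonicalUser", "ID": "owner-id"},
     "Permission": "FULL_CONTROL"},
]}
print(public_grants(acl))  # ['READ'] -> bucket contents are world-listable
```

A real audit would loop this over every bucket in the account and also inspect bucket policies, which can grant public access independently of the ACL.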

2) Check for forgotten subdomains:

Subdomains can be taken over by a hacker with the help of external services. In 2014, the Detectify security research team discovered a serious attack vector that allowed anyone to take control of a subdomain through DNS misconfigurations, in a way that is not noticeable to the domain owner. Thanks to this research, we automated tests for it in a feature called Asset Monitoring. If you are not using our scanner, you can still remediate this manually: go through all your DNS entries and remove any that are unused, or that point to external cloud-based services you no longer use.
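The manual review above can be roughly automated. This is only a sketch: the suffix list and the records are illustrative, and a real takeover check (like Asset Monitoring) also verifies whether the target service still claims the name.

```python
# Flag DNS CNAME entries that point at third-party cloud services, which
# are the candidates for subdomain takeover if the service was abandoned.
CLOUD_SUFFIXES = (".s3.amazonaws.com", ".github.io", ".herokuapp.com",
                  ".azurewebsites.net", ".cloudfront.net")

def risky_cnames(records: dict) -> list:
    """Return subdomains whose CNAME targets an external cloud service."""
    return [sub for sub, target in records.items()
            if target.rstrip(".").endswith(CLOUD_SUFFIXES)]

records = {
    "blog.example.com": "example.github.io.",
    "shop.example.com": "old-shop.herokuapp.com.",
    "www.example.com":  "example.com.",
}
for sub in risky_cnames(records):
    print(f"review {sub}: remove the record if the service is no longer used")
```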

3) Weak Identity, Credential and Access Management:

Two-factor authentication, strong password and identity controls, and proper employee off-boarding are simple measures that help keep information out of the wrong hands. Beyond strong password policies, encrypt traffic in transit with SSL/TLS certificates, and harden your email domain against spoofing with SPF and DMARC records. When an employee leaves the company, their access accounts should be deactivated immediately so they are not forgotten and left vulnerable.
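To show that two-factor authentication is not magic, here is a minimal sketch of TOTP (RFC 6238), the algorithm behind most authenticator apps, using only the Python standard library. The secret is the shared test key from the RFC, not something you should ever hard-code in practice.

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238) over HMAC-SHA1 (RFC 4226)."""
    counter = struct.pack(">Q", unix_time // step)          # 8-byte big-endian counter
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                              # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret; at T=59 the expected 6-digit code is 287082.
print(totp(b"12345678901234567890", 59))  # 287082
```

In production you would of course use a maintained library and your identity provider’s built-in 2FA rather than rolling your own.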

4) Broken authentication:

It’s critical that users cannot execute functions on cloud services they are not authorized to perform, for example, an unauthenticated person uploading files into a “protected” cloud storage bucket. Uploads are governed by an upload policy, a set of requirements attached to the request, and unfortunately these are at risk from weak controls, as shown in Frans Rosén’s research, Bypassing and exploiting Bucket Upload Policies and Signed URLs. The recommendation is to create an upload policy specifically for each file-upload request, or at least per user.
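The idea of a per-user policy can be sketched as the checks below. The field names, prefix and limits are illustrative assumptions, not Frans Rosén’s exact setup; real S3 POST policies express the same constraints (key prefix, size range, expiry) in a signed policy document.

```python
import time

def validate_upload(policy: dict, key: str, size: int, now=None) -> bool:
    """Check one upload request against a per-user policy (illustrative fields)."""
    now = time.time() if now is None else now
    return (now < policy["expires_at"]                 # policy is short-lived
            and key.startswith(policy["key_prefix"])   # can't write outside own prefix
            and size <= policy["max_bytes"])           # can't upload huge objects

policy = {"key_prefix": "uploads/user-42/",
          "max_bytes": 5_000_000,
          "expires_at": time.time() + 300}             # valid for 5 minutes only

print(validate_upload(policy, "uploads/user-42/avatar.png", 120_000))  # True
print(validate_upload(policy, "uploads/other-user/x.png", 120_000))    # False
```

The key point from the research: if one broad policy is reused for everyone, any holder of it can write anywhere the policy allows, so keep the prefix and lifetime as narrow as possible.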

5) Check that user details and API keys are not left out in the open:

With sharing comes the responsibility of not sharing too much. Several high-profile companies, including Uber and Slack, have unfortunately learned this the hard way after accidentally uploading code to GitHub containing sensitive user information. The ubiquitous use of GitHub and other code-sharing platforms benefits the developer community through knowledge and best-practice sharing, but security still applies: code that is uploaded, especially legacy code, should be checked to ensure no sensitive information such as passwords, user tokens or API keys is exposed. Default repository settings should not be relied on either, as they’re often set to ‘public’.
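A simple pre-commit check can catch the most obvious leaks. A minimal sketch: the AWS access key ID format (`AKIA` plus 16 uppercase alphanumerics) is documented by AWS, while the other two patterns are rough heuristics; dedicated secret scanners ship far larger rulesets.

```python
import re

PATTERNS = {
    "aws_access_key_id":  re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "private_key_header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
    "generic_api_key":    re.compile(r"(?i)api[_-]?key\s*[:=]\s*['\"][^'\"]{16,}['\"]"),
}

def scan(text: str) -> list:
    """Return the names of secret patterns that match the given source text."""
    return [name for name, rx in PATTERNS.items() if rx.search(text)]

# AWS's documented example key, never a real credential:
sample = 'aws_key = "AKIAIOSFODNN7EXAMPLE"  # oops, committed to git'
print(scan(sample))  # ['aws_access_key_id']
```

Running a check like this in CI, before code reaches a public repository, is much cheaper than rotating credentials after a leak.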

6) Logging and Monitoring:

Good practices around logging and monitoring server activity are essential for keeping an eye on your cloud security status. With sufficient logging and monitoring you become aware of malicious activity sooner, and can answer the question, “are we even interesting enough to hack?” Collecting the information is not enough on its own, though: immediate action is also required so that any substantial risk to the cloud environment is mitigated as soon as possible.
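As a toy illustration of turning logs into action, the rule below flags source IPs with repeated failed logins. The event format and threshold are assumptions; in practice this logic lives in a log pipeline with alerting (CloudWatch, an ELK stack, or similar) rather than a script.

```python
from collections import Counter

def brute_force_suspects(events: list, threshold: int = 5) -> list:
    """Return source IPs with at least `threshold` failed login attempts."""
    failures = Counter(e["ip"] for e in events if e["result"] == "fail")
    return [ip for ip, n in failures.items() if n >= threshold]

# Illustrative log events (RFC 5737 documentation addresses):
events = ([{"ip": "203.0.113.9", "result": "fail"}] * 6 +
          [{"ip": "198.51.100.4", "result": "ok"}])
print(brute_force_suspects(events))  # ['203.0.113.9']
```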

7) Continuously monitor for common vulnerabilities and patch them:

Injection is listed in the OWASP Top 10 for good reason: cloud services can be exploited with injection attacks too. If you’ve migrated to the cloud, it’s especially important to check the security status of legacy code. SQL injection remains prevalent, and because attacks against it can easily be automated, the risk is even higher; when detected, it should be patched without hesitation. Conveniently, such vulnerabilities are also easy to find with an automated scanner. The same goes for related vulnerabilities such as XML External Entity (XXE) and Server-Side Request Forgery (SSRF).
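The standard fix for SQL injection is parameterized queries, and a small demo makes the difference concrete. Here `sqlite3` stands in for whatever database driver your application uses; the table and the attacker input are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "' OR '1'='1"

# Vulnerable: string concatenation lets the input rewrite the query.
vulnerable = f"SELECT secret FROM users WHERE name = '{attacker_input}'"
print(len(conn.execute(vulnerable).fetchall()))  # 1 -> leaks every row

# Safe: the driver binds the value, so it can never become SQL.
safe = conn.execute("SELECT secret FROM users WHERE name = ?",
                    (attacker_input,)).fetchall()
print(len(safe))  # 0 -> no user is literally named "' OR '1'='1"
```

Every mainstream driver and ORM supports this binding style, which is why a scanner flagging concatenated queries in legacy code is such a quick win.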

8) Always update your technology:

Using the latest version of your technology is crucial for security. Patches are often released with bug fixes, but not everyone feels the urgency to install them, leaving applications vulnerable. “jQuery is a good example of this, where multiple outdated versions of the framework are in use despite all their known vulnerabilities,” Detectify CIO Johan Norrman explains about the lack of updating. “Someone has even developed a website to make the information readily available.”


9) Due diligence and cleaning up superfluous tools:

Even if you cover the cloud security basics you may still suffer a breach, which is why due diligence on your incident response routine is needed. Stay prepared by rehearsing your contingency steps, testing your recovery process and making sure backups actually work. When auditing your toolbox, apply the “use it or lose it” rule: retiring unused tools means you no longer have to keep them updated, and removes a preventable risk. As the CSA states, “this applies whether the company is considering moving to the cloud or merging with or acquiring a company that has moved to the cloud or is considering doing so.” The CSA published a list of cloud security risks, the Treacherous 12, last year.

How can Detectify help?

With a tool like Detectify, you can continuously monitor your web apps with a scanner whose security tests are updated at least bi-weekly, keeping up with the fast rate at which new web vulnerabilities are found. Detectify is a SaaS service hosted in the cloud, which means you can scan your web applications without downloading any software. We test for the OWASP Top 10 (2013 and 2017 versions), AWS S3 bucket misconfigurations and various key disclosure vulnerabilities. We also offer Asset Monitoring to help identify potential vulnerabilities related to DNS misconfigurations. Our services are hosted on AWS, we are recognized as a preferred technology partner of AWS, and we offer a connector to Route 53 so you can import information from your DNS directly for monitoring.

How does it work? The moment you log in to the tool, you’ll be running the most up-to-date version. We spin up a server on AWS to scan your web applications, report the findings to you once the scan is done, and then the server is destroyed. None of your web application data is stored by us on AWS. That’s the beauty of the cloud.

Are you ready to try out Detectify with your cloud services? Sign up for an account and scan with a free trial here.
