Happy Pi Day! A day to celebrate things that are round, and also a day to celebrate modern infrastructure!
Amazon S3 was released on this day 17 years ago. I don’t know about you, but I’ve always been a pie person over cake for birthdays, so today I’m going to eat a slice of pie and celebrate by sharing thoughts on what security teams can do to better secure S3.
S3 has been around slightly longer than I have been working in technology. As I looked back to frame my thoughts, I was surprised at what other technologies launched around the same time – Blu-ray, Twitter, Shopify, the Nintendo Wii, and the PlayStation 3. 2006 was a pretty big year for tech. Garry’s Mod, a fun, albeit strange, open-world building game, was released in 2006 too, which I was mostly focused on at the time. That, and the Core 2 Duo (Read more on 2006).
The wildest part is that while almost everything else on that list has changed substantially, S3 is foundationally very similar – which is a good thing! It demonstrates one of the core values of the cloud: an abstraction where the end user doesn’t have to worry about Moore’s Law. S3 delivers the same service to consumers while the underlying hardware constantly changes.
What’s not a good thing is that security practices in the cloud still lag behind. Only a few months into 2023, we have already seen substantial breaches impacting cloud data. Ransomware has become commoditized, and threat actors continue to look for new opportunities. That opportunity is where the data lives – the cloud. Below are a few thoughts on how organizations can start improving their data security in S3.
I want to note this is not an exhaustive list. Your cloud security strategy should be comprehensive with foundations in IAM, removing “shadow cloud”, DevSecOps best practices, and resilience. The tips below are three parts of the strategy. If you’re on a cloud security journey and would like help organizing your thoughts, feel free to reach out.
One of the foundations of a good security practice is knowing what you have to defend. In security we sometimes call this a “crown jewels” analysis. From a cloud perspective, this means both infrastructure and data that are either important to business operations, or regulated.
In S3, I see many folks struggle with visibility into where data lives. The root cause is typically a weak relationship with the teams generating and manipulating data – the developers.
Tip 1.0466… Security starts with knowing what data matters. Have your team spend time with DevOps. Build partnerships rather than becoming the department of rules. Speak not just with IT leaders, but with the people doing the work. Ask engineers to diagram data flows – silos create the biggest failures. In addition, look into DSPM (data security posture management) tools to help classify existing infrastructure, and be sure to select one that works for your data.
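Before a DSPM tool is in place, even a lightweight pass over your object inventory beats nothing. Here’s a minimal sketch of what rule-based data typing could look like – the key patterns and category names are hypothetical, and real rules should come out of those data-flow conversations with your engineers:

```python
import re

# Hypothetical classification rules mapping key patterns to data types.
# Real patterns should come from data-flow diagrams built with DevOps.
CLASSIFIERS = [
    (re.compile(r"(pii|customer|user)"), "regulated-pii"),
    (re.compile(r"(invoice|billing|payment)"), "financial"),
    (re.compile(r"(log|tmp|cache)"), "ephemeral"),
]

def classify_key(key: str) -> str:
    """Return the first matching data type for an S3 object key."""
    lowered = key.lower()
    for pattern, data_type in CLASSIFIERS:
        if pattern.search(lowered):
            return data_type
    return "unclassified"

# Illustrative object keys, not real data.
inventory = [
    "exports/customer_emails_2023.csv",
    "billing/invoice_0042.pdf",
    "app/cache/session.tmp",
    "misc/notes.txt",
]
for key in inventory:
    print(f"{key}: {classify_key(key)}")
```

Anything landing in “unclassified” is exactly the conversation starter you want for your next session with DevOps.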
Once you have established a complete view of the data you’re storing in S3, you can start “Data typing,” or categorizing data by business requirement, mapped back to regulatory and business retention requirements. From there you can…
Tip 1.0466… Configure S3 backups and lifecycle expiration rules, and design a cloud data policy that meets your business and regulatory requirements.
Data stored in S3 that is governed by privacy laws such as CCPA or GDPR has to be retained and deleted on a different schedule than, say, email data regulated by FINRA.
Data retention strategy can be complicated. It varies by vertical, contractual obligation, and data type. Where to store the data needs to also be determined by purpose and cost. In many cases the data has to be searchable (Clumio offers searchable backups!). Ensure you consult the right experts when you make these decisions; this is a cross-functional effort where your finance, compliance, legal, and security departments need to collaborate.
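Once those cross-functional decisions are made, encoding them is the easy part. As a sketch – the data types, prefixes, and retention periods below are placeholders, not recommendations – a retention map can be turned into S3 lifecycle rules in the shape boto3’s `put_bucket_lifecycle_configuration` expects:

```python
# Hypothetical retention schedule per data type. Actual values must come
# from legal, compliance, and finance -- not an engineer's guess.
RETENTION_DAYS = {
    "ephemeral": 30,
    "financial": 7 * 365,   # e.g., a multi-year bookkeeping requirement
    "regulated-pii": 365,   # deletion schedules under CCPA/GDPR vary
}

def lifecycle_rules(retention: dict) -> list:
    """Build S3 lifecycle expiration rules from a retention map,
    assuming objects are stored under a prefix per data type."""
    return [
        {
            "ID": f"expire-{data_type}",
            "Filter": {"Prefix": f"{data_type}/"},
            "Status": "Enabled",
            "Expiration": {"Days": days},
        }
        for data_type, days in sorted(retention.items())
    ]

rules = lifecycle_rules(RETENTION_DAYS)
# To apply (requires AWS credentials; bucket name is illustrative):
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="example-bucket",
#     LifecycleConfiguration={"Rules": rules})
```

Keeping the policy as data like this also gives compliance and legal something reviewable, rather than asking them to audit console screenshots.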
One of AWS’s well-architected pillars is reliability. In addition to the regulatory commitments of an organization, data needs to be operationally resilient. Many architectures on AWS, even those that split workloads into multiple availability zones, have one central data lake or bucket. The biggest myths in AWS architecture are often related to resilience. “It’s on AWS, it must be safe and resilient.” The service is resilient, yes, but there is no guarantee for the resiliency of the data, configuration, or other components that turn building blocks into functional applications.
Tip 1.0466… Back up your S3 data.
This one is pretty self-explanatory. Replication, versioning, and “write once read many” S3 configurations are not a cost-effective fit for many business applications, and none of them protects against malicious attacks. Disaster recovery and business continuity plans need multiple recovery stages; Active/Active is an infrastructure protection strategy more than it is a data protection strategy. (Want a deep dive? See “Using Replication for Backups? There Are Better Options.”)
When you speak with your DevOps teams, ensure you understand the pros and cons of each data reliability method, how each resiliency decision was made, and how it impacts operational cost, complexity, and regulatory compliance. Again, work together to build architectures that meet all business requirements, not just what ships fast.
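A simple way to start that conversation is an inventory check: which buckets have neither versioning enabled nor coverage from your backup tooling? The sketch below keeps the logic pure Python so it’s easy to test – the bucket names are illustrative, and in practice you would populate the inputs from boto3’s `get_bucket_versioning` and from your backup platform’s inventory:

```python
def unprotected_buckets(versioning: dict, backed_up: set) -> list:
    """Return buckets with neither versioning enabled nor an external backup.

    `versioning` maps bucket name -> "Enabled", "Suspended", or None
    (the values boto3's get_bucket_versioning reports); `backed_up` is
    the set of buckets your backup tooling covers (hypothetical here).
    """
    return sorted(
        name
        for name, status in versioning.items()
        if status != "Enabled" and name not in backed_up
    )

# Illustrative inventory. In practice, populate with something like:
#   boto3.client("s3").get_bucket_versioning(Bucket=name).get("Status")
versioning = {
    "prod-data-lake": None,
    "app-assets": "Enabled",
    "audit-logs": "Suspended",
}
backed_up = {"audit-logs"}
print(unprotected_buckets(versioning, backed_up))  # -> ['prod-data-lake']
```

Note that an empty result doesn’t mean you’re done – versioning alone is not a backup, as discussed above – but a non-empty one is a clear place to start.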
That’s it! 3.14 tips for Pi Day!
Tip 1.0466… Classify data. DevOps, Legal, and Security all need to work together. DSPM or similar tooling may help with data classification.
Tip 1.0466… Retain and delete data in a compliant manner.
Tip 1.0466… Back up your S3 data.
Go buy a pie, reflect on your S3 data strategy, and enjoy Pi Day!