The Datadog Cloud Security Report, the SEC Charging Four Companies for Misleading Cyber Disclosures, and Apple Open-Sourcing Private Cloud Compute Resources.
Learn more about The Radar here
Datadog has unique insight into the cloud environments of many vendors. Every year, they publish their “State of Cloud Security Report.” Here are my takeaways on each point.
While this report is phenomenal, I do wonder if it suffers from some level of survivorship bias. The data from the report “has come from customers of Datadog Infrastructure Monitoring, Datadog Logs, and Datadog Cloud Security Management (CSM)”. This means companies that do not have the budget for expensive security services are not included, potentially skewing the data.
The SEC charged four companies with civil penalties in connection with the SolarWinds hack: Unisys ($4,000,000), Avaya ($1,000,000), Check Point ($995,000), and Mimecast ($990,000). The reputational damage, however, is a far greater concern. The common thread is that each of these companies downplayed and minimized the severity of its incidents. The SEC said: “The federal securities laws prohibit half-truths, and there is no exception for statements in risk-factor disclosures.”
Only a few companies were named, but this signals a silent but necessary shift toward placing more responsibility on companies to be forthcoming about the scale and scope of incidents. I would imagine every Fortune 500 security department is going to use this as justification for getting more resources. If you don’t have the tools, resources, or people to confidently say what data was lost, I would identify that as a business risk.
Back in June, Apple announced their “Private Cloud Compute” (PCC) technology, which allows workloads originating on an Apple device (with relatively low computing power compared to cloud/data center hardware) to be processed on a device in the cloud with more powerful, specialized computational hardware. The unique aspect is the security guarantees Apple provides around the AI’s processing of the data. Specifically, Apple states: “user data sent to PCC isn’t accessible to anyone other than the user – not even to Apple”.
Apple, who is not typically known for being open about their hardware and software, has published technical details about the architecture of their Private Cloud Compute environments, released a set of tools that lets security researchers attempt to break Apple’s security guarantees, and open-sourced some of the code. Additionally, their bug bounty program for Private Cloud Compute pays up to $1,000,000 for “Remote Attacks on Request Data”.
Releasing information, tooling, and source code, plus offering security researchers a potential $1,000,000 bug bounty payout for breaking their security, shows that Apple is VERY confident in this design. For context, $1,000,000 approaches the market price of an iMessage remote code execution and privilege escalation zero-day on Zerodium ($1,500,000), which is highly unusual. Apple does not historically go the open-source route.
I think Apple knows that people are generally wary of trusting AI systems. Apple was a bit late to the AI game, but I think the angle they’re taking of attempting to make the most trusted AI system is a very smart move. I hope it’s successful and I hope it inspires Apple to go more of the open-source route in the future.
A note on links: you can link directly to highlighted text on a page by appending #:~:text=the text you wish to highlight to a URL. For example: https://en.wikipedia.org/wiki/Cast-iron_cookware#:~:text=soap
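Building these links by hand is easy to get wrong when the phrase contains spaces or punctuation, since the text fragment must be percent-encoded. A minimal sketch in Python (the helper name `highlight_url` is mine, not from any library; it assumes the standard urllib module for encoding):

```python
from urllib.parse import quote


def highlight_url(url: str, phrase: str) -> str:
    """Append a text-fragment directive (#:~:text=...) to a URL.

    The phrase is percent-encoded with no characters exempted, since
    commas and dashes carry special meaning inside the directive.
    """
    return f"{url}#:~:text={quote(phrase, safe='')}"


print(highlight_url("https://en.wikipedia.org/wiki/Cast-iron_cookware", "soap"))
# → https://en.wikipedia.org/wiki/Cast-iron_cookware#:~:text=soap
```

Note that text fragments are resolved entirely by the browser; servers never see anything after the #.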
Want this sent straight to your inbox? You can subscribe here to have it sent every Monday morning.