AWS S3 Misconfiguration Exposes Personal Information of Nearly 200 Million Voters

Multiple security reports have recently highlighted the dangers of cloud computing misconfigurations, and those dangers are now manifesting in the real world. The personal information of nearly 200 million voters was exposed in an Amazon Web Services-hosted S3 account belonging to Deep Root Analytics, a data firm working for the Republican National Committee (RNC). Security firm UpGuard Inc. discovered the exposure. “In total, the personal data of potentially all of America’s registered voters was exposed,” UpGuard stated in a post that was last updated yesterday.

Misconfigured cloud-based data stores have been the source of many recent vulnerabilities and threats, such as the recent spate of ransomware attacks on MongoDB databases, Elasticsearch repositories, and other sources. Security firms hunting for such vulnerabilities have made the misconfigurations known. Chris Vickery, an UpGuard security analyst, discovered the exposed voter data while searching open cloud repositories. Deep Root Analytics’ repository included an AWS S3 bucket with no access protection. UpGuard stated that anyone with an Internet connection could have accessed Donald Trump’s Republican data operation simply by navigating to a six-character Amazon subdomain: “dra-dw”. It was not clear whether any attackers had downloaded the data for malicious purposes.

The UpGuard report is just one of many such announcements:

  • Threat Stack Inc. conducted an April analysis of AWS cloud usage and found widespread security misconfigurations that affected nearly three quarters of the more than 200 surveyed companies.
  • RedLock Inc. published a May research report that found many security problems were primarily due to user misconfigurations of public cloud platforms. AWS was prominently mentioned in the report.
  • Appthority published investigation results earlier this month showing that nearly 43 TB of enterprise data was exposed via cloud back-ends, including personally identifiable information (PII).

UpGuard, which discovered the issue, wrote in its post about AWS S3 bucket provisioning that Amazon’s Simple Storage Service (S3) storage containers are notorious for being left open to the public, even by some of the largest companies in the world. If a bucket contains sensitive information such as customer lists, corporate databases, or other large collections of sensitive records, that mistake can lead to a major data breach, and it has. Even though the misconfiguration comes down to a single permission setting, it can have devastating consequences. It took Vickery several days to download the 1.1 TB of exposed data, roughly the equivalent of 500 hours of video. UpGuard warned that despite the severity of this breach, it will likely be dwarfed by breaches to come if cyber resilience is not embraced across all Internet-facing systems.
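The “single permission” at issue is an access grant that makes a bucket readable by everyone. As a minimal illustration (the function and the policy document below are hypothetical examples, not Deep Root Analytics’ actual configuration), a bucket policy that grants access to the wildcard principal can be flagged like this:

```python
import json

def public_statements(policy_json: str) -> list:
    """Return the statements in an S3 bucket policy document that
    grant access to everyone (Principal "*" or {"AWS": "*"})."""
    policy = json.loads(policy_json)
    flagged = []
    for stmt in policy.get("Statement", []):
        principal = stmt.get("Principal")
        is_public = principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        )
        if stmt.get("Effect") == "Allow" and is_public:
            flagged.append(stmt)
    return flagged

# Hypothetical policy resembling a world-readable bucket.
EXAMPLE_POLICY = json.dumps({
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",
    }],
})

print(len(public_statements(EXAMPLE_POLICY)))  # → 1
```

A scan like this only inspects the policy document; in practice, public access can also come from bucket ACLs, which would need a separate check.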

AWS S3 Offers a Storage Option for Rarely Accessed Data

Amazon Web Services (AWS) recently announced improvements to its Simple Storage Service (S3), including an expansion of its Intelligent-Tiering option to cover archival data. AWS launched S3 Intelligent-Tiering in late 2018 to give S3 users a more cost-effective storage option for data with unpredictable access requirements.

S3 Intelligent-Tiering originally had two data tiers: one for data that is accessed frequently and another for data that is accessed less often. The service automatically moves objects between tiers based on how frequently users request access. If an object isn’t accessed for 30 days, it is moved to the Infrequent Access tier; once it is accessed again, it is moved back to the Frequent Access tier. This process optimizes storage costs for data that is accessed only occasionally, irregularly, or both.

This week AWS announced an expansion to S3 Intelligent-Tiering: two new tiers, Archive Access and Deep Archive Access, for data that is rarely accessed. The first is for objects that haven’t been accessed in the last 90 days, the second for objects that haven’t been accessed within 180 days. S3 Intelligent-Tiering moves data between tiers of the hierarchy as needed.

Marcia Villalba, an AWS senior developer advocate, wrote a blog post about the cost benefits of S3 Intelligent-Tiering. With S3 you pay monthly storage, request, and data transfer fees; with Intelligent-Tiering you also pay a small per-object monthly fee for monitoring and automation. S3 Intelligent-Tiering charges no retrieval fee and no fee for moving data between tiers. Objects in the Frequent Access tier are billed at the S3 Standard rate, while objects in the Infrequent Access tier are billed at the S3 Standard-Infrequent Access rate.
Objects in the Archive Access tier are billed at the S3 Glacier rate, and objects in the Deep Archive Access tier are billed at the S3 Glacier Deep Archive rate. AWS also announced several other S3-related enhancements last week.
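The tier transitions described above can be sketched as a small simulation. This is a simplification of AWS’s actual per-object monitoring, and it assumes the optional archive tiers are enabled; the thresholds come from the article:

```python
def tier_for(days_since_last_access: int) -> str:
    """Map days since an object was last accessed to its
    S3 Intelligent-Tiering tier (simplified model; assumes
    the optional archive tiers are enabled)."""
    if days_since_last_access >= 180:
        return "Deep Archive Access"
    if days_since_last_access >= 90:
        return "Archive Access"
    if days_since_last_access >= 30:
        return "Infrequent Access"
    return "Frequent Access"

# Any access resets the clock, moving the object back to Frequent Access.
print(tier_for(0))    # → Frequent Access
print(tier_for(45))   # → Infrequent Access
print(tier_for(120))  # → Archive Access
print(tier_for(200))  # → Deep Archive Access
```

Because retrieval and inter-tier movement carry no fee, an object cycling through these tiers only changes its monthly storage rate, not its access cost.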