Ensure Glue crawlers have schedules appropriate for data freshness needs
Crawler schedules should balance data freshness requirements with cost. Avoid unnecessarily frequent crawling.
Security Impact
Overly frequent crawling incurs unnecessary cost without adding value when the data schema rarely changes.
How to Remediate
Review crawler schedules. Use event-driven crawling or less frequent schedules where real-time metadata updates aren't required (see the sketch below).
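A minimal remediation sketch using boto3 is shown below. It assumes AWS credentials and region are already configured; the crawler name "sales-data-crawler" and the daily cron expression are illustrative assumptions, not values prescribed by this check.

```python
"""Sketch: review Glue crawler schedules and relax an overly frequent one.

Assumptions: boto3 credentials/region are configured; the crawler name and
cron expression below are illustrative, not values required by this check.
"""
import boto3

glue = boto3.client("glue")


def list_crawler_schedules() -> None:
    """Print each crawler's schedule expression so frequencies can be reviewed."""
    paginator = glue.get_paginator("get_crawlers")
    for page in paginator.paginate():
        for crawler in page["Crawlers"]:
            schedule = crawler.get("Schedule", {}).get("ScheduleExpression", "on-demand")
            print(f"{crawler['Name']}: {schedule}")


def relax_schedule(crawler_name: str, cron: str = "cron(0 2 * * ? *)") -> None:
    """Replace a crawler's schedule with a less frequent one (daily at 02:00 UTC here)."""
    glue.update_crawler(Name=crawler_name, Schedule=cron)


def crawl_on_demand(crawler_name: str) -> None:
    """Event-driven alternative: invoke from a Lambda/EventBridge target when new data lands."""
    glue.start_crawler(Name=crawler_name)


if __name__ == "__main__":
    list_crawler_schedules()
    # relax_schedule("sales-data-crawler")  # hypothetical crawler name
```

Switching a crawler to on-demand runs triggered by data-arrival events keeps the Data Catalog current without paying for crawls when nothing has changed.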
How TigerGate Helps
TigerGate continuously monitors your AWS environment to detect and alert on this misconfiguration. Here's what our platform does for this specific check:
- Continuous Scanning
Automatically scans all AWS Glue resources across your AWS accounts every hour (a simplified detection sketch follows this list)
- Instant Alerts
Get notified via Slack, email, or webhooks when this misconfiguration is detected
- One-Click Remediation
Fix this issue directly from the TigerGate dashboard with our guided remediation
- Compliance Evidence
Automatically collect audit evidence for Cost Optimization compliance
- Drift Detection
Get alerted if this configuration drifts back to an insecure state after remediation
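To illustrate the kind of schedule audit behind the continuous-scanning step, here is a minimal sketch. It is not TigerGate's actual implementation; it flags crawlers whose Glue cron expressions fire hourly or more often, using a deliberately crude heuristic on the hours field.

```python
"""Minimal audit sketch: flag Glue crawlers with hourly-or-faster schedules.

Illustrative only; not how TigerGate implements this check.
"""
import re

import boto3

glue = boto3.client("glue")


def runs_hourly_or_faster(schedule_expression: str) -> bool:
    """Crude heuristic: a cron whose hours field is '*' fires at least once per hour."""
    match = re.match(r"cron\((?P<minutes>\S+) (?P<hours>\S+) .*\)", schedule_expression)
    return bool(match) and match.group("hours") == "*"


def find_frequent_crawlers() -> list[str]:
    """Return names of crawlers whose schedules look more frequent than necessary."""
    flagged = []
    paginator = glue.get_paginator("get_crawlers")
    for page in paginator.paginate():
        for crawler in page["Crawlers"]:
            expression = crawler.get("Schedule", {}).get("ScheduleExpression", "")
            if expression and runs_hourly_or_faster(expression):
                flagged.append(crawler["Name"])
    return flagged


if __name__ == "__main__":
    for name in find_frequent_crawlers():
        print(f"Crawler with hourly-or-faster schedule: {name}")
```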
Check Details
- Check ID: aws-glue-14
- Service: AWS Glue
- Category: Cost Optimization
- Severity: LOW
Automate This Check
TigerGate automatically scans your AWS environment for this and 575+ other security checks.
Related AWS Glue Checks
Ensure Glue Data Catalog encryption at rest is enabled
Ensure Glue connection passwords are encrypted
Ensure Glue jobs have encryption for S3 data targets
Ensure Glue jobs have CloudWatch log encryption enabled
Ensure Glue jobs are configured with job bookmarks