Add observability to your CloudQuery sync runs with Datadog
Prerequisites #
- A Datadog account. See the Datadog getting started documentation.
- The Datadog agent installed on the machine(s) running the CloudQuery sync jobs. See the agent installation documentation for more information.
- Log collection enabled in the Datadog agent configuration. See the log collection documentation for more information.
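If log collection is not yet enabled, it is typically turned on in the agent's main configuration file. A minimal sketch, assuming the default `datadog.yaml` shipped with the agent (its location depends on your OS):

```yaml
# datadog.yaml (main Datadog agent configuration file)
# Enabling this lets the agent tail log files declared under conf.d,
# including the CloudQuery source configured below.
logs_enabled: true
```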
Adding a Custom Log Source for CloudQuery Logs #
- Add the `--log-format json` flag to the CloudQuery sync command (see the example command after these steps). This ensures that the logs are in a structured format that Datadog can parse. Datadog can collect logs in non-JSON format as well, but using JSON format ensures that we can query the logs.
- Locate the Datadog `conf.d` directory. You can find the location of the directory based on your OS in the Datadog documentation.
- Under the `conf.d` directory, create a directory named `cloudquery.d`.
- Under the `cloudquery.d` directory, create a file named `conf.yaml` with the following content:
logs:
  - type: "file"
    path: "<full path to the CloudQuery log file>" # Example path: /var/log/cloudquery/cloudquery.log
    service: "CloudQuery"
    source: "CloudQuery"
- Restart the Datadog agent for changes to take effect.
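With the agent configured, the sync command from the first step only needs the JSON log format flag. A minimal sketch, assuming a spec file named `config.yml` (a placeholder for your own spec) and that the log file the CLI writes to is the same one referenced in the `path` entry of `conf.yaml` above:

```bash
# Run the sync with structured JSON logs so Datadog can parse the
# fields queried later (errors, table, resources, ...).
# config.yml is a placeholder for your CloudQuery spec file.
cloudquery sync config.yml --log-format json
```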
You can find more information on setting up a custom log source in the Datadog documentation.
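After restarting the agent, you can check that it picked up the new log source from its status output. A sketch, assuming the `datadog-agent` binary is on your `PATH` (some installs require `sudo`):

```bash
# The "Logs Agent" section of the output should list the source
# defined in conf.d/cloudquery.d/conf.yaml and the file it is tailing.
sudo datadog-agent status
```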
Using Datadog to query CloudQuery logs #
The `@errors:>0 @table:*` query filters all tables that had errors during the sync job.

The `@resources:>10000 @table:*` query filters all tables that had more than 10,000 resources during the sync job.
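For context, the `@errors`, `@table`, and `@resources` facets in these queries map to fields in the JSON log lines CloudQuery emits during a sync. A per-table summary line might look roughly like this; only the `table`, `resources`, and `errors` fields are implied by the queries above, and the remaining fields and all values are illustrative:

```json
{
  "level": "info",
  "table": "aws_s3_buckets",
  "resources": 12843,
  "errors": 0,
  "message": "table sync finished",
  "time": "2023-01-01T12:00:00Z"
}
```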
Summary and Next Steps #
Written by Erez Rokah
I'm a security-oriented open source maintainer. I joined the CloudQuery team in April 2022 to focus on building a developer-first, open source, high-performance data integration platform for security and infrastructure teams.