How to retain data in Azure Log Analytics beyond the 31-day retention period

Posted 2020-07-31 05:02

I am looking into options for retaining data beyond the 31 days that Azure Log Analytics supports. I tried reading the documentation, but I haven't been able to figure it out yet. How should this be architected? Is the expectation that logs should be archived to a cold Azure Storage account by some other method at the data source, or is there a way to route the parsed log data from Azure Log Analytics to an Azure Storage account?

2 answers
祖国的老花朵
#2 · 2020-07-31 05:22

If you are not on the Free tier, you can change the Log Analytics data retention setting to up to 730 days. Read more details here.
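
For reference, the retention setting can also be changed from PowerShell. A minimal sketch, assuming the Az.OperationalInsights module is installed and you have signed in with Connect-AzAccount; the resource group and workspace names are placeholders:

```powershell
# A minimal sketch: raise the workspace retention to the 730-day maximum.
# "my-resource-group" and "my-workspace" are placeholders for your own names.
Set-AzOperationalInsightsWorkspace `
    -ResourceGroupName "my-resource-group" `
    -Name "my-workspace" `
    -RetentionInDays 730
```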

Generally, you can do the following with diagnostic logs:

  • Save them to a Storage Account for auditing or manual inspection. You can specify the retention time (in days) using resource diagnostic settings (see the sketch after this list).
  • Stream them to Event Hubs for ingestion by a third-party service or custom analytics solution such as Power BI.
  • Analyze them with Azure Monitor, where the data is written immediately to Azure Monitor with no need to first write the data to storage.
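
For the first option, the diagnostic setting that archives a resource's logs to a storage account can be created from PowerShell. A minimal sketch, assuming the Az.Monitor module and an existing sign-in; both resource IDs are placeholders:

```powershell
# Placeholders: substitute the IDs of the resource whose logs you want to
# archive and of the destination storage account.
$resourceId = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault>"
$storageId  = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"

# Route the resource's diagnostic logs to the storage account and keep them
# for 365 days before the retention policy deletes them.
Set-AzDiagnosticSetting -ResourceId $resourceId `
    -StorageAccountId $storageId `
    -Enabled $true `
    -RetentionEnabled $true `
    -RetentionInDays 365
```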

Depending on your usage, you could choose to store these logs in a storage account. Here is a sample PowerShell script showing how to convert Storage Analytics log data to JSON format and post the JSON data to a Log Analytics workspace.
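
The linked sample is built on the Log Analytics HTTP Data Collector API. A condensed sketch of the posting step, following the documented SharedKey signature scheme; the workspace ID, primary key, and log type are placeholders:

```powershell
# Placeholders for your workspace credentials and custom log type.
$customerId = "<workspace-id>"
$sharedKey  = "<workspace-primary-key>"
$logType    = "StorageAnalyticsLogs"   # appears in the workspace as StorageAnalyticsLogs_CL

# The JSON payload to ingest; the real sample builds this from the parsed
# Storage Analytics log lines.
$bodyBytes = [Text.Encoding]::UTF8.GetBytes('[{"Computer":"vm01","Message":"sample record"}]')

# Build the HMAC-SHA256 signature over the canonical string the API expects.
$method       = "POST"
$contentType  = "application/json"
$resource     = "/api/logs"
$rfc1123date  = [DateTime]::UtcNow.ToString("r")
$stringToHash = "$method`n$($bodyBytes.Length)`n$contentType`nx-ms-date:$rfc1123date`n$resource"
$hmac      = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key  = [Convert]::FromBase64String($sharedKey)
$hash      = $hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToHash))
$signature = "SharedKey ${customerId}:$([Convert]::ToBase64String($hash))"

# Post the records to the workspace's ingestion endpoint.
$uri = "https://$customerId.ods.opinsights.azure.com$($resource)?api-version=2016-04-01"
Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Body $bodyBytes `
    -Headers @{ "Authorization" = $signature; "Log-Type" = $logType; "x-ms-date" = $rfc1123date } `
    -UseBasicParsing
```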

References: Query Azure Storage analytics logs in Azure Log Analytics

看我几分像从前
#3 · 2020-07-31 05:32

I would prefer that summarized logs be stored outside of Log Analytics. Could we cascade the query results out as a backup to cold storage such as Data Lake, Databricks, or ADX (Azure Data Explorer)? A VM > Monitoring > KQL query > Storage path would be more flexible, and it could keep the summaries consistent between monitoring inside Log Analytics (using KQL and dashboards) and long-term analytics outside it (using Data Lake, etc.).
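
One way to approximate that VM > Monitoring > KQL query > Storage path today is to run the query from PowerShell and drop the summarized result into a blob container. A minimal sketch, assuming the Az.OperationalInsights and Az.Storage modules and an existing sign-in; the workspace ID, storage account, key, and container name are placeholders:

```powershell
# Placeholder workspace and an example summarizing query.
$workspaceId = "<workspace-id>"
$query = "Heartbeat | summarize HeartbeatCount = count() by Computer, bin(TimeGenerated, 1h)"

# Run the KQL query against the workspace for the last day.
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId `
    -Query $query -Timespan (New-TimeSpan -Days 1)

# Serialize the summarized rows and upload them to cold storage.
$tempFile = Join-Path $env:TEMP ("summary-{0:yyyyMMdd}.json" -f (Get-Date))
$result.Results | ConvertTo-Json -Depth 5 | Out-File -FilePath $tempFile -Encoding utf8

$ctx = New-AzStorageContext -StorageAccountName "<account>" -StorageAccountKey "<key>"
Set-AzStorageBlobContent -File $tempFile -Container "la-archive" `
    -Blob (Split-Path $tempFile -Leaf) -Context $ctx
```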

And of course, support for automating this with Azure Data Factory or Logic Apps would be highly appreciated.
