
Working Azure logging setup

Published 2019-04-04 06:18

Question:

I've been trying to set up trace logging for a while and I simply can't get it to work properly. It doesn't help that there are so many wrong/outdated articles on this topic. Please, can someone give me a good, practical setup for trace logging & viewing on Azure (1.6)?

All I want is the ability to capture and view trace messages from my app.

I started out with the standard DiagnosticMonitorTraceListener, but that ends up in table storage, and I can't for the life of me figure out how I'm supposed to interact with the logs in table storage. In Visual Studio I can 'view' them, but it's so cumbersome to use that it's practically useless: no sorting, and you have to write awkward date filters which half the time don't work.

Custom logs seem the way to go. I have worked a lot with log4net, so I picked that. You can redirect log4net to trace, but then you end up with the same crappy table storage, so a custom log file it is. Now, I'm already confused as to whether this is even supported. Some articles mention diagnostics file locks causing all sorts of issues. Not sure if this is still a problem; regardless, it's weird: why provide custom-log transfer capability if you can't read/write the logs?! Anyway, I haven't had any issues writing to the log (that I've noticed).
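
For reference, this is roughly how I point the file appender at the role's local storage at runtime (just a sketch; "Log4NetLogs" is simply what I called the LocalStorage resource):

// Sketch only: point log4net's RollingFileAppender at the role's local storage path,
// which is only known at runtime via RoleEnvironment.GetLocalResource.
using System.IO;
using log4net;
using log4net.Appender;
using log4net.Layout;
using log4net.Repository.Hierarchy;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class Log4NetLocalStorage
{
    public static void Configure()
    {
        // "Log4NetLogs" is the LocalStorage resource declared in ServiceDefinition.csdef.
        string logDir = RoleEnvironment.GetLocalResource("Log4NetLogs").RootPath;

        var layout = new PatternLayout("%date [%thread] %-5level %logger - %message%newline");
        layout.ActivateOptions();

        var appender = new RollingFileAppender
        {
            File = Path.Combine(logDir, "app.log"),
            AppendToFile = true,
            RollingStyle = RollingFileAppender.RollingMode.Size,
            MaxSizeRollBackups = 5,
            MaximumFileSize = "10MB",
            Layout = layout
        };
        appender.ActivateOptions();

        // Attach the appender to the root logger.
        var hierarchy = (Hierarchy)LogManager.GetRepository();
        hierarchy.Root.AddAppender(appender);
        hierarchy.Configured = true;
    }
}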

Setup is according to the MSDN articles (extra vague and very spread out, by the way): define a LocalStorage element in the ServiceDefinition (128 MB), add a directory log transfer in the role startup, go. This seems to work, until after a while the role craps itself during a restart with an 'OverallQuota is not big enough' message, and the role just dies and refuses to come up. There is so much space available, even within the 4080 MB; this simply does not make sense.
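
The role startup part of that looks roughly like this (a sketch only; the "Log4NetLogs" resource name and the container name are placeholders of mine):

// Sketch of the role startup wiring (SDK 1.x style); resource and container names are placeholders.
using System;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

        // Transfer the custom log4net directory to blob storage every 5 minutes.
        var logs = RoleEnvironment.GetLocalResource("Log4NetLogs");
        config.Directories.DataSources.Add(new DirectoryConfiguration
        {
            Path = logs.RootPath,
            Container = "wad-log4net",      // destination blob container
            DirectoryQuotaInMB = 128        // has to fit inside the overall diagnostics quota
        });
        config.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

        DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);
        return base.OnStart();
    }
}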

Again, I followed articles on increasing the quota, but those seemed to make matters worse. I set the DiagnosticStore size to 8 GB in the ServiceDefinition. Does not work; it still craps out, only with the higher number. Setting the OverallQuota equal to 8 GB as well does not help either. For some reason an install on a clean image works fine, but when there is a restart or an update it decides to calculate the quotas differently. Regardless of the size of the DiagnosticStore, the 'calculated' value is always OverallQuota + the log4net LocalStorage. Nothing I do seems to change that. Extremely frustrating, as it seems to work, only to die some time later.
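
For reference, the ServiceDefinition piece I'm describing looks roughly like this (a sketch; the role name and sizes are just what I used, with config.OverallQuotaInMB bumped to match in the startup code):

<!-- ServiceDefinition.csdef (sketch) -->
<WebRole name="MyWebRole" vmsize="Small">
  <LocalResources>
    <!-- Resize the diagnostics store beyond its default -->
    <LocalStorage name="DiagnosticStore" sizeInMB="8192" cleanOnRoleRecycle="false" />
    <!-- Separate local storage for the log4net files -->
    <LocalStorage name="Log4NetLogs" sizeInMB="128" cleanOnRoleRecycle="false" />
  </LocalResources>
</WebRole>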

I've also tried diagnostics.wadcfg, but couldn't get Azure to pick it up. I made sure it was copied to the root output folder and removed any changes to the monitor from my code. Nada, zip... I went through all the log files I could find on the instance; not a single mention or error anywhere.
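
For what it's worth, the file I tried looked roughly like this (a sketch; apparently placement matters too, and for a web role the file is supposed to end up in the bin folder next to the role binaries):

<?xml version="1.0" encoding="utf-8"?>
<!-- diagnostics.wadcfg (sketch): just the basic trace-log transfer -->
<DiagnosticMonitorConfiguration xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration"
                                configurationChangePollInterval="PT1M"
                                overallQuotaInMB="4096">
  <Logs bufferQuotaInMB="1024"
        scheduledTransferLogLevelFilter="Verbose"
        scheduledTransferPeriod="PT1M" />
</DiagnosticMonitorConfiguration>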

Why is this so hard on Azure? Trace logs are the most basic logging tool for any application. This is really killing the Azure experience.

Answer 1:

We're using log4net on Azure as well.

Minor warning: when running multiple roles per instance, I've only been able to get one role [the main role] to actually write logs successfully... [not good!!!]

How to set it up... Easy as.

In Global.asax, configure log4net as per usual:

protected void Application_Start()
{
    log4net.Config.XmlConfigurator.Configure();
}

In your role entry point, run the configuration as well. F#*^ knows why, but if you don't do it in both places it won't work (well, not in my case anyway):

public override void Run()
{
    log4net.Config.XmlConfigurator.Configure();
}

Then, in your config file:

<log4net>
    <appender name="TraceAppender" type="log4net.Appender.TraceAppender">
        <layout type="log4net.Layout.PatternLayout">
            <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
        </layout>
    </appender>
    <root>
        <level value="ALL" />
        <appender-ref ref="TraceAppender" />
    </root>
</log4net>

Make sure you have Azure tracing enabled

<system.diagnostics>
    <trace>
        <listeners>
            <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.7.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="AzureDiagnostics">
                <filter type="" />
            </add>
        </listeners>
    </trace>
</system.diagnostics>
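
One thing that catches people out: the listener only buffers trace output on the instance; something still has to schedule the transfer into the WADLogsTable. In our case that happens in the role's OnStart, roughly like this (a sketch, SDK 1.x style):

// Sketch (SDK 1.x style): schedule the transfer of buffered trace output into the WADLogsTable.
using System;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        var config = DiagnosticMonitor.GetDefaultInitialConfiguration();
        config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose; // keep everything
        config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);  // push to table storage every minute
        DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);
        return base.OnStart();
    }
}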

We opted not to use Storage Studio, the reason being that you pay for any data coming out of Azure, while data transactions within Azure are free. Hooking into table storage is easy as pie, so we built a screen to show the logs; it took 2-3 hours and works like a bomb.
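
The screen itself is little more than a query against WADLogsTable. Roughly (a sketch; the entity class here is hand-rolled and only maps the columns we cared about):

// Sketch: reading recent entries from WADLogsTable with the SDK 1.x storage client.
using System;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class WadLogEntry : TableServiceEntity
{
    public long EventTickCount { get; set; }
    public string Role { get; set; }
    public string RoleInstance { get; set; }
    public int Level { get; set; }
    public string Message { get; set; }
}

public static class LogScreen
{
    public static IQueryable<WadLogEntry> LastHour(string connectionString)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var context = account.CreateCloudTableClient().GetDataServiceContext();

        // WADLogsTable partition keys are "0" + tick count, so a >= comparison gives a time range.
        string from = "0" + DateTime.UtcNow.AddHours(-1).Ticks;
        return context.CreateQuery<WadLogEntry>("WADLogsTable")
                      .Where(e => e.PartitionKey.CompareTo(from) >= 0);
    }
}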



Answer 2:

We're using log4net quite successfully with Azure; however, we are dumping the data into Azure Table Storage. It is definitely a more scalable solution. I highly recommend purchasing a license for Cerebrata's (now Red Gate's) Storage Studio to ease the pain of dealing with Azure Table Storage, and their Diagnostics Manager to ease the pain of looking at trace logs.



Answer 3:

For scenarios with large amounts of log data, we have tried directing the data into blobs, partly using the EtwTraceListener, as well as sending the critical pieces of information we need to act on to table storage. We use the AzureStorageTraceListener as well as a custom TraceListener that directs the data to our own Azure table, because the out-of-the-box schema did not meet our requirements. WADLogsTable can grow quickly, and we need a way to trim it without incurring a huge storage transaction cost or taking tracing offline while we delete and recreate the table, so our custom listener creates a different table for each month.
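
To give an idea of the shape of it (a stripped-down sketch, not our production code; the names and the minimal entity schema are made up for illustration):

// Stripped-down sketch of a TraceListener writing to a per-month Azure table (SDK 1.x storage client).
using System;
using System.Diagnostics;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class MonthlyLogEntry : TableServiceEntity
{
    public string Message { get; set; }

    public MonthlyLogEntry(string message)
    {
        PartitionKey = DateTime.UtcNow.ToString("yyyyMMddHH");
        RowKey = Guid.NewGuid().ToString();
        Message = message;
    }

    public MonthlyLogEntry() { }   // required for serialization
}

public class MonthlyTableTraceListener : TraceListener
{
    private readonly CloudTableClient _client;

    public MonthlyTableTraceListener(string connectionString)
    {
        _client = CloudStorageAccount.Parse(connectionString).CreateCloudTableClient();
    }

    // One table per month, e.g. Logs201904; trimming old data is just deleting old tables.
    private string CurrentTableName
    {
        get { return "Logs" + DateTime.UtcNow.ToString("yyyyMM"); }
    }

    public override void Write(string message) { WriteLine(message); }

    public override void WriteLine(string message)
    {
        string tableName = CurrentTableName;
        _client.CreateTableIfNotExist(tableName);   // a real version would cache this check

        var context = _client.GetDataServiceContext();
        context.AddObject(tableName, new MonthlyLogEntry(message));
        context.SaveChangesWithRetries();
    }
}

In practice you would buffer and batch the writes rather than paying a storage transaction per line, but the monthly table name is the part that makes trimming cheap: old months are removed by deleting whole tables.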

Ranjith
http://www.opstera.com



Answer 4:

I use custom logging to write log data out to Azure Table Storage for all my sites. However, as you rightly point out, retrieving the data from the tables is the awkward bit. I therefore also have a local, internal-use-only website running on our premises which retrieves the data from the Azure tables and stores it in a local database. Although you do pay to transfer data out of Azure, the costs are minuscule; our logging solution for all our websites costs us less than £1 per month.

Having the data moved into a local database means that it is infinitely more queryable than keeping it in the Azure system. I maintain a discussion on Azure logging which covers a lot of the issues above and also provides links out to code examples.
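
The pull side is nothing fancy. A rough sketch of the idea (the entity type and SaveLocally are placeholders for whatever local database code you use; it also ignores query continuation, which a real version would handle):

// Sketch of the pull side: fetch rows newer than the last PartitionKey we stored locally.
using System;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class LogEntry : TableServiceEntity
{
    public string Message { get; set; }
    public int Level { get; set; }
}

public static class LogPuller
{
    public static string PullNewEntries(string connectionString, string tableName, string lastPartitionKey)
    {
        var context = CloudStorageAccount.Parse(connectionString)
                                         .CreateCloudTableClient()
                                         .GetDataServiceContext();

        var newEntries = context.CreateQuery<LogEntry>(tableName)
                                .Where(e => e.PartitionKey.CompareTo(lastPartitionKey) > 0)
                                .ToList();

        foreach (var entry in newEntries)
        {
            SaveLocally(entry);                    // placeholder: insert into the local database
            lastPartitionKey = entry.PartitionKey; // remember the high-water mark
        }
        return lastPartitionKey;
    }

    static void SaveLocally(LogEntry entry) { /* local DB insert goes here */ }
}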