
Unwanted traces are logged when using Arcus.Observability in Azure Functions #359

Open
fgheysels opened this issue Mar 31, 2022 · 22 comments

@fgheysels (Member)

I have set up logging to Application Insights using Arcus.Observability and Serilog like this:

internal class Startup : FunctionsStartup
{
    public override void Configure(IFunctionsHostBuilder builder)
    {
        var configuration = builder.GetContext().Configuration;

        builder.Services.AddLogging(loggingBuilder => ConfigureLogging(loggingBuilder, configuration));
    }

    private static void ConfigureLogging(ILoggingBuilder builder, IConfiguration configuration)
    {
        var logConfiguration = new LoggerConfiguration()
                               .MinimumLevel.Information()
                               .Enrich.FromLogContext()
                               .Enrich.WithComponentName("my-function")
                               .WriteTo.Console();

        var telemetryKey = configuration.GetValue<string>("APPINSIGHTS_INSTRUMENTATIONKEY");

        if (!String.IsNullOrWhiteSpace(telemetryKey))
        {
            logConfiguration.WriteTo.AzureApplicationInsights(telemetryKey);
        }

        builder.AddSerilog(logConfiguration.CreateLogger());
    }
}

This works as expected, but there are some weird things to be noticed:

  • The traces table in Application Insights contains (null) entries:

    [screenshot: traces table showing (null) message entries]

  • Some traces are written twice to Application Insights, albeit slightly different (look at the quotes):

    [screenshot: duplicate trace entries that differ only in quoting]

@stijnmoreels (Member)

Hmm, I don't know if it has anything to do with it, or if it's even relevant: we usually move the logging configuration outside the delegate so that the configuration is created at startup time rather than at runtime, when it's requested.

@stijnmoreels stijnmoreels added bug Something isn't working application-insights All issues related to Azure Application Insights tracing All issues related to tracing labels Apr 6, 2022
@stijnmoreels stijnmoreels added this to the v2.5.0 milestone Apr 6, 2022
@fgheysels (Member Author)

> Hmm, I don't know if it has anything to do with it, or if it's even relevant: we usually move the logging configuration outside the delegate so that the configuration is created at startup time rather than at runtime, when it's requested.

How do you mean exactly?

@fgheysels (Member Author)

I am also wondering if other people have seen this behavior as well.

@stijnmoreels (Member)

> Hmm, I don't know if it has anything to do with it, or if it's even relevant: we usually move the logging configuration outside the delegate so that the configuration is created at startup time rather than at runtime, when it's requested.
>
> How do you mean exactly?

Instead of:

builder.Services.AddLogging(loggingBuilder =>
{
    LoggerConfiguration logConfig = ...
    ...
});

We do this in the template:

LoggerConfiguration logConfig = ...
builder.Services.AddLogging(loggingBuilder =>
{
    ...
});

I was looking at the traces that were 'different' from the rest, and it got me thinking that maybe there was a 'delay' or something that made the traces look the way they do.

@stijnmoreels (Member)

Plus, maybe #364 is also related to this. Maybe we should remove the ApplicationInsightsLogger for this to work correctly.

@stijnmoreels (Member)

> Plus, maybe #364 is also related to this. Maybe we should remove the ApplicationInsightsLogger for this to work correctly.

Thinking more and more that this could also be a cause. We should probably have a streamlined approach for Azure Functions that works with Arcus Observability (and doesn't rely on anything else, like the built-in support) for service-to-service correlation.

@fgheysels (Member Author)

This indeed sounds reasonable. Can you investigate this further, please?

@stijnmoreels (Member)

> This indeed sounds reasonable. Can you investigate this further, please?

Looked a bit deeper and checked whether those strange message = null entries are still there when Microsoft's ApplicationInsightsLoggerProvider isn't registered anymore, and they're gone. So, in conclusion: if we continue with #364, where we will discuss how to remove this built-in provider, we will also resolve this issue. We could also already close this, as it's actually not Arcus that's the cause.
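For context, removing such a built-in provider registration boils down to something like the following sketch. The helper name is made up and the type-name check is illustrative; Arcus ships its own extension method for this.

```csharp
using System.Linq;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

public static class LoggingBuilderExtensions
{
    // Hypothetical helper: removes any registered ILoggerProvider whose implementation
    // type name matches Microsoft's built-in Application Insights provider.
    public static ILoggingBuilder RemoveBuiltInApplicationInsightsProvider(this ILoggingBuilder builder)
    {
        ServiceDescriptor descriptor = builder.Services.FirstOrDefault(service =>
            service.ServiceType == typeof(ILoggerProvider) &&
            service.ImplementationType?.Name == "ApplicationInsightsLoggerProvider");

        if (descriptor != null)
        {
            builder.Services.Remove(descriptor);
        }

        return builder;
    }
}
```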

@stijnmoreels (Member)

We can close this, right? As we have #364, which will make sure that we don't have Microsoft's stuff anymore.

@stijnmoreels (Member)

Closing for now, as #364 is solving this for us.

@fgheysels (Member Author) commented Feb 14, 2023

It seems that this issue is still relevant.
I've received a report that this is still happening in 'isolated' functions (in a function created using the Arcus Azure Function Http Trigger template).
Some entries are being logged twice.

@stijnmoreels (Member)

We'll need some information on what, where, version, and how the logging and telemetry is set up.

@Pauwelz commented Feb 14, 2023

I started from the Arcus Azure Functions with Http trigger template v1.1.0:

dotnet new arcus-az-func-http -o arcus-az-func-http -fw isolated -ih false -eo true

Added my instrumentation key in local.settings.json and just tried to send a default request.
In the console everything looks fine:

[screenshot: console output]

In the transaction search I see the following:

[screenshot: transaction search results]

@Pauwelz commented Feb 14, 2023

This was also reported here: serilog-contrib/serilog-sinks-applicationinsights#157.
What resolved the issue for me was removing the APPLICATIONINSIGHTS_CONNECTION_STRING setting, which disables the default logger.

@fgheysels (Member Author)

But when you remove the APPLICATIONINSIGHTS_CONNECTION_STRING configuration setting, doesn't the Function also stop logging requests?
As in: you no longer see the function invocations in the 'performance' blade in App Insights. See also arcus-azure/arcus.templates#785.

@stijnmoreels (Member)

@Pauwelz @fgheysels, just did the test myself with a freshly created template and indeed saw these duplicate lines.
These are all in the Host.Function category, right? We could disable those in host.json:

{
  "logging": {
    "logLevel": {
      "Host.Function": "None"
    }  
  }
}

@fgheysels (Member Author)

@tleblanc159 posted a possible solution here:

serilog-contrib/serilog-sinks-applicationinsights#157 (comment)

We should definitely check this out.

@tleblanc159

@fgheysels

I posted that solution here at first but wasn't sure if it really fit this issue. As far as I know, the Arcus HTTP Triggered In-Process Function App template uses an extension method provided by Arcus to completely remove the Azure WebJobs Host's ApplicationInsightsLoggerProvider. The solution I posted is a way to interop with that provider instead of completely removing it.

To me, it seems like interop-ing would defeat the purpose of using Arcus since Arcus would no longer be in control of all logs produced by the In Process Function App (which kinda seems like the whole point of using Arcus.Observability). At that point, you may as well use the Serilog Application Insights sink instead, like in the solution of mine you linked.

@stijnmoreels (Member)

The Arcus Serilog Application Insights sink does more than only pass logs to Application Insights; it makes sure that you can write telemetry (requests, dependencies, metrics, events) right from the logging framework and lets you include contextual information.
Interop'ing here is indeed something we should look at, as we also do that in other environments (Web API, Messaging). It is only due to the tedious Azure Functions framework that we come across this problem.

@tleblanc159 commented Feb 23, 2023

> The Arcus Serilog Application Insights sink does more than only pass logs to Application Insights; it makes sure that you can write telemetry (requests, dependencies, metrics, events) right from the logging framework and lets you include contextual information. Interop'ing here is indeed something we should look at, as we also do that in other environments (Web API, Messaging). It is only due to the tedious Azure Functions framework that we come across this problem.

I see, that makes a lot more sense now. I misunderstood the use case of Arcus.Observability.

The solution I posted should be extensible to Serilog using the Arcus App Insights sink. You could even take it a step further and create a custom implementation of ILoggerFactory that doesn't allow the Azure Functions host's App Insights LoggerProvider to be added to the factory, allows adding all registered LoggerProviders plus any others you specify (anything that logs to App Insights), and removes the reliance on the LoggerFactory class, since it might already be registered and used in an incompatible way. Maybe an extension method on ILoggingBuilder (if one for Arcus doesn't exist already) could handle most of that and then be used to configure the LoggerFactory instance. Or both.
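A minimal sketch of that idea, assuming the goal is simply to swallow the host's Application Insights provider while passing everything else through. The class name and the type-name check are illustrative, not an existing API.

```csharp
using Microsoft.Extensions.Logging;

// Hypothetical decorator: wraps an existing ILoggerFactory and refuses to add
// the Azure Functions host's Application Insights provider to it.
public sealed class FilteringLoggerFactory : ILoggerFactory
{
    private readonly ILoggerFactory _inner;

    public FilteringLoggerFactory(ILoggerFactory inner) => _inner = inner;

    public void AddProvider(ILoggerProvider provider)
    {
        // Skip the built-in Application Insights provider; pass everything else through.
        if (provider.GetType().Name == "ApplicationInsightsLoggerProvider")
        {
            return;
        }

        _inner.AddProvider(provider);
    }

    public ILogger CreateLogger(string categoryName) => _inner.CreateLogger(categoryName);

    public void Dispose() => _inner.Dispose();
}
```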

@stijnmoreels (Member) commented Mar 1, 2023

Seems like the proposed way to do Application Insights in isolated Azure Functions is with their dedicated Application Insights package: Microsoft.Azure.Functions.Worker.ApplicationInsights. The problem, though, is that this package is still in preview.
We do not seem to be the only ones struggling with this: Azure/azure-functions-dotnet-worker#760

Can you try to add this, @fgheysels, alongside our Arcus stuff and see what comes up? You'll have to use SerilogLoggerFactory as the registered ILoggerFactory instead of the current Serilog setup. Automatic dependency tracking will not be available, but it will at least correlate the Arcus logging back.
I guess we have to wait on this (like others), or make sure that we remove the automatic Application Insights package from our Azure Functions package and use the preview package in our templates.
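As a rough sketch of that suggestion, the isolated-worker bootstrap could register Serilog's factory like this. The exact bootstrap varies per template, and the minimal Serilog configuration here stands in for the full Arcus setup from the issue.

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Serilog;
using Serilog.Extensions.Logging;

public static class Program
{
    public static void Main()
    {
        // Placeholder configuration: the real setup would use the Arcus
        // Application Insights sink and enrichers instead.
        Log.Logger = new LoggerConfiguration()
            .MinimumLevel.Information()
            .WriteTo.Console()
            .CreateLogger();

        IHost host = new HostBuilder()
            .ConfigureFunctionsWorkerDefaults()
            .ConfigureServices(services =>
            {
                // Route all ILogger<T> output through Serilog by replacing the factory.
                services.AddSingleton<ILoggerFactory>(new SerilogLoggerFactory(Log.Logger));
            })
            .Build();

        host.Run();
    }
}
```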

@stijnmoreels (Member)

Conclusion: create a preview package of Arcus.WebApi.Logging.AzureFunctions that uses this preview Application Insights package, so that we can fix the correlation problem on isolated Azure Functions.
