log4net input should include global and logical thread context data #287
@teeboy75 thanks for submitting this issue. First, it was not clear in the documentation. Also, I would suggest using StdOutput to determine whether the problem is that the Log4net input is not capturing the NDC context data, or whether it is a problem with the Elasticsearch output. Can you please do another experiment and let us know if that causes NDC data to show up? Thanks!
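For that experiment, a minimal `eventflowconfig.json` might look like the sketch below. It keeps the shape of the config posted later in this thread; `StdOutput` is EventFlow's console output type, and the rest is an assumption about your setup rather than a tested file:

```json
{
  "inputs": [
    {
      "type": "Log4net",
      "logLevel": "Debug"
    }
  ],
  "outputs": [
    {
      "type": "StdOutput"
    }
  ],
  "extensions": []
}
```

With this in place, every captured event should be printed to the console window, which makes it easy to see whether NDC data ever reaches the pipeline.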
Added StdOutput and removed the additional log levels from eventflowconfig.json. It does not make any difference: NDC output does not show up on the console window or in ELK. The only additional JSON-enriched fields that seem to be sent to Elasticsearch are "timestamp", "providerName", "level" and "keywords". Of course, the "payload" element contains just the text "Hey! Listen!". Here is the JSON.
Just to be clear: if I remove the EventFlow input and output and use log4net's log appenders directly, the NDC context works.
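For reference, the kind of NDC usage being described could be sketched as follows. This is a minimal illustration, assuming the `ThreadContext.Stacks["NDC"]` API (which replaced the older, deprecated `NDC` class in log4net) and a logger configured elsewhere:

```csharp
using log4net;

class Example
{
    static readonly ILog log = LogManager.GetLogger(typeof(Example));

    static void Main()
    {
        // Push a value onto the NDC stack; it stays attached to every
        // event logged inside the using block and is popped on Dispose.
        using (ThreadContext.Stacks["NDC"].Push("12345"))
        {
            log.Debug("Hey! Listen!");
        }
    }
}
```

With a classic log4net appender whose layout includes the `%ndc` conversion pattern, the value "12345" is rendered alongside the message, which matches the behavior reported above.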
OK, thanks. I looked a bit more closely into this, and it looks like appending data from the log4net global context and logical thread context was never implemented. For comparison, here is how the Stackify Retrace appender does this. @teeboy75, would you be willing to submit a PR to have these properties added?
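For anyone picking this up, a rough sketch of where such a change could start is below. The method name and payload dictionary are hypothetical, not actual EventFlow code; the grounded part is that log4net's `LoggingEvent.GetProperties()` returns a merged view of the global, thread, and logical-thread context properties:

```csharp
using System.Collections.Generic;
using log4net.Core;

static class Log4netContextSketch
{
    // Hypothetical helper: collect all context properties attached to a
    // log4net event so an input could copy them into the EventFlow payload.
    public static IDictionary<string, object> CollectContextProperties(LoggingEvent loggingEvent)
    {
        var payload = new Dictionary<string, object>();

        // GetProperties() merges GlobalContext, ThreadContext,
        // LogicalThreadContext, and per-event properties into one dictionary.
        var props = loggingEvent.GetProperties();
        foreach (var key in props.GetKeys())
        {
            payload[key] = props[key];
        }
        return payload;
    }
}
```

The EventFlow Log4net input would then need to merge this dictionary into the event it forwards to outputs; exactly where that happens depends on the input's implementation.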
Sure! I can do it late next week.
Thank you! I much appreciate your help, @teeboy75.
Adding support for global and thread context stacks for log4net. This is for Azure#287
@karolz-ms |
@teeboy75 Roger! |
@pattisapu01 Do you mind if we (myself or some other Microsoft developer) "steal" the code from you (pattisapu01@c9c4d65) and finish this PR? I'd be happy if your time and work on this issue did not go to waste... |
Absolutely! Unfortunately, the company I work for is not moving quickly. Thank you for getting back to me.
Regards
Prakash
I have a simple EventFlow setup writing to Elasticsearch with a Log4net input.
eventflowconfig.json:
```json
{
  "inputs": [
    { "type": "Log4net", "logLevel": "Debug" },
    { "type": "Log4net", "logLevel": "Info" },
    { "type": "Log4net", "logLevel": "Warn" },
    { "type": "Log4net", "logLevel": "Error" },
    { "type": "Log4net", "logLevel": "Fatal" }
  ],
  "outputs": [
    {
      "type": "ElasticSearch",
      "indexNamePrefix": "defaultindex",
      "serviceUri": "http://servername:9200",
      "basicAuthenticationUserName": "user",
      "basicAuthenticationUserPassword": "password",
      "eventDocumentTypeName": "diagData",
      "numberOfShards": 1,
      "numberOfReplicas": 0,
      "refreshInterval": "5s"
    }
  ],
  "settings": {
    "pipelineBufferSize": "1000",
    "maxEventBatchSize": "100",
    "maxBatchDelayMsec": "500",
    "maxConcurrency": "8",
    "pipelineCompletionTimeoutMsec": "30000"
  },
  "extensions": []
}
```
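As an aside, and assuming `logLevel` acts as a minimum threshold the way log level thresholds usually do in log4net, the five input entries above could likely be collapsed into a single entry at the lowest level, which would capture Debug and everything above it:

```json
"inputs": [
  { "type": "Log4net", "logLevel": "Debug" }
]
```

This matches the later comment in the thread about removing the additional log levels from the config without it changing the observed behavior.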
The issue is that the custom data (12345 in this case) that I write to the NDC context is not written to Elasticsearch. How do I configure the Elasticsearch output to recognize custom data pushed to NDC?
The debug message, along with the "exception" message, does get written to ELK.