
[sdk] Provide better concurrency modes #5643

Closed
wants to merge 10 commits into from

Conversation

@reyang reyang (Member) commented May 22, 2024

This follows the same design as the push/pull metrics exporter:

[AttributeUsage(AttributeTargets.Class, AllowMultiple = false, Inherited = true)]
public sealed class ExportModesAttribute : Attribute

Exporter authors (and likewise processor, sampler, etc. authors) can optionally provide additional hints so the SDK can better serve their needs:

[ConcurrencyModes(ConcurrencyModes.Multithreaded | ConcurrencyModes.Reentrant)]
internal class MyExporter : BaseExporter<LogRecord>
{
    ...
}

In addition, I envision that console exporters can leverage this to provide synchronization across multiple instances, so that a log exporter and a metrics exporter won't race with each other and produce garbled text in stdout:

[ConcurrencyModes(ConcurrencyModes.Global)]
internal class ConsoleExporter<T> : BaseExporter<T>
{
    ...
}
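
For illustration only (this is not part of the PR diff, and GlobalExportGate/ExportHelper are hypothetical names), here is a minimal sketch of how an SDK-side export path could honor the Global hint by funneling every Global-marked exporter through one process-wide lock:

using System;

// Hypothetical helper, not an API proposed in this PR: one process-wide lock
// shared by every component that declares ConcurrencyModes.Global, so a console
// log exporter and a console metrics exporter never write to stdout at the same time.
internal static class GlobalExportGate
{
    public static readonly object Lock = new object();
}

internal static class ExportHelper
{
    // Sketch of the gating the SDK could apply around Export calls.
    public static void ExportWithGlobalGate(bool exporterDeclaresGlobal, Action export)
    {
        if (exporterDeclaresGlobal)
        {
            lock (GlobalExportGate.Lock)
            {
                export();
            }
        }
        else
        {
            export();
        }
    }
}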

@reyang reyang requested a review from a team May 22, 2024 01:04
@@ -5,6 +5,7 @@
using OpenTelemetry;
using OpenTelemetry.Logs;

[ConcurrencyModes(ConcurrencyModes.Multithreaded | ConcurrencyModes.Reentrant)]
reyang (Member Author):

This is to show how the attribute can be used. Once folks agree with the direction, I'll revert the example code and update the changelog.

codecov bot commented May 22, 2024

Codecov Report

Attention: Patch coverage is 66.66667% with 7 lines in your changes missing coverage. Please review.

Project coverage is 85.67%. Comparing base (6250307) to head (c5b6dd1).
Report is 259 commits behind head on main.

Additional details and impacted files


@@            Coverage Diff             @@
##             main    #5643      +/-   ##
==========================================
+ Coverage   83.38%   85.67%   +2.29%     
==========================================
  Files         297      255      -42     
  Lines       12531    11051    -1480     
==========================================
- Hits        10449     9468     -981     
+ Misses       2082     1583     -499     
Flag                            Coverage Δ
unittests                       ?
unittests-Project-Experimental  85.62% <66.66%> (?)
unittests-Project-Stable        85.60% <66.66%> (?)

Flags with carried forward coverage won't be shown.

Files                                            Coverage Δ
src/OpenTelemetry/SimpleExportProcessor.cs       85.71% <82.35%> (-14.29%) ⬇️
src/OpenTelemetry/ConcurrencyModesAttribute.cs   0.00% <0.00%> (ø)

... and 114 files with indirect coverage changes

@reyang reyang changed the title Provide better concurrency modes [sdk] Provide better concurrency modes May 22, 2024
/// Reentrant, the component can be invoked recursively without resulting
/// in a deadlock or infinite loop.
/// </summary>
Reentrant = 0b1,
Member:

We removed Global for now; should we also remove Reentrant? It doesn't seem to be used at the moment.

reyang (Member Author):

My gut feeling is no. Multithreaded and Reentrant are closely related; the existing implementation (which uses lock(obj)) implies that the exporter supports reentrancy but not multithreading, which is buggy IMHO.
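
For context, here is a minimal stand-in (not the actual SDK source; the type and member names are illustrative) for the lock(obj) pattern being described. Because the C# lock statement is Monitor-based and reentrant for the owning thread, a recursive call made from inside the export path re-enters the critical section on the same thread, while calls from other threads are simply serialized, so the exporter never observes truly concurrent invocations:

using System;

// Simplified, illustrative stand-in for a lock-based simple processor.
internal class LockBasedProcessor<T>
{
    private readonly object syncObject = new object();
    private readonly Action<T> export;

    public LockBasedProcessor(Action<T> export) => this.export = export;

    public void OnEnd(T data)
    {
        // 'lock' is reentrant per thread: if export(data) ends up emitting
        // telemetry on the same thread, execution re-enters this block without
        // deadlocking. Other threads block here, so the exporter is never
        // called concurrently.
        lock (this.syncObject)
        {
            this.export(data);
        }
    }
}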

reyang (Member Author):

No strong feelings, though; I can scope this out and add it later once we've figured out how to correctly handle reentrancy.

Member:

Interesting. So if the attribute is not specified and defaults to 0, our implied default is essentially SingleThreaded | Reentrant?

Should the defined Reentrant flag then negate the default behavior?

public enum ConcurrencyModes : byte
{
    /// <summary>
    /// Nonreentrant, the component cannot be invoked recursively without resulting
    /// in a deadlock or infinite loop.
    /// </summary>
    Nonreentrant = 0b1,

    /// <summary>
    /// Multithreaded, the component can be invoked concurrently across
    /// multiple threads.
    /// </summary>
    Multithreaded = 0b10,
}

reyang (Member Author):

> Interesting. So if the attribute is not specified and defaults to 0, our implied default is essentially SingleThreaded | Reentrant?

Maybe. I personally consider this a bug in the current SDK implementation.

> Should the defined Reentrant flag then negate the default behavior?

I guess no; better to treat it as a bug, then figure out how to fix it in a non-breaking way.

reyang (Member Author):

My thinking: if the exporter doesn't have the attribute at all, fall back to the old (and buggy) behavior; if the exporter has the attribute, start enforcing the correct behavior.
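
A rough sketch of that detection step, assuming the ConcurrencyModesAttribute shape from this PR; ConcurrencyModeDetector is a hypothetical helper, and the real enforcement logic would live in the processors:

using System;
using System.Reflection;

internal static class ConcurrencyModeDetector
{
    // Returns null when the exporter type carries no ConcurrencyModesAttribute,
    // meaning "keep the old (lock-based) behavior"; otherwise returns the
    // attribute so the processor can enforce the declared modes.
    public static ConcurrencyModesAttribute GetDeclaredModes(Type exporterType)
    {
        return exporterType.GetCustomAttribute<ConcurrencyModesAttribute>(inherit: true);
    }
}

A processor could then branch on the result: null means take the legacy locked path, while a non-null result means lock only when the declared modes don't include Multithreaded (again, just one possible shape for the enforcement).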

@reyang reyang added the pkg:OpenTelemetry Issues related to OpenTelemetry NuGet package label May 24, 2024
github-actions bot commented Jun 5, 2024

This PR was marked stale due to lack of activity and will be closed in 7 days. Commenting or Pushing will instruct the bot to automatically remove the label. This bot runs once per day.

@github-actions github-actions bot added and then removed the Stale label Jun 5, 2024
github-actions bot:

This PR was marked stale due to lack of activity and will be closed in 7 days. Commenting or Pushing will instruct the bot to automatically remove the label. This bot runs once per day.

@github-actions github-actions bot added the Stale label Jun 14, 2024
github-actions bot:

Closed as inactive. Feel free to reopen if this PR is still being worked on.

@github-actions github-actions bot closed this Jun 21, 2024
@eerhardt (Contributor):

@reyang @CodeBlanch - I see this got closed due to inactivity. Do we have a plan for removing the private reflection in the GenevaExporter?

@reyang reyang (Member Author) commented Jun 21, 2024:

> @reyang @CodeBlanch - I see this got closed due to inactivity. Do we have a plan for removing the private reflection in the GenevaExporter?

@CodeBlanch would you follow up?

@CodeBlanch (Member):

@eerhardt Still working on the plan. I've been exploring some other possible APIs/alternatives. Hoping to have something to propose soon and then we'll pick a direction to have in place for the next release.
