[Paid Bounty] Upgrade MonoGame to use BasisUniversal for cross platform Texture Compression (MonoGame#8456)

To briefly restate the paid bounty: MonoGame's texture compression tooling does not work on ARM-based CPUs (like the recent Apple M-series chips). The goal of the bounty was to restore functionality on ARM-based CPUs. The implementation detail of the bounty was to swap out the old texture compression tools (PVRTexLibNET and NVTT) for a new command-line utility, [Basis Universal](https://github.com/BinomialLLC/basis_universal). This PR completes the cross-platform goal of the bounty, and introduces some implementation _beyond_ what was originally documented in the paid bounty.

Here is a quick tl;dr of what is in the PR, with more details below.
- Content Pipeline Context
- ETC2 Support
- ASTC Support
- Dotnet tool references to `mgcb-basisu` and `mgcb-crunch`
- NuGet library references to `KtxLoader` and `BcnEncoder`
- Removal of PVRTex and NVTT


To begin, I ripped out the PVRTex and NVTT libraries. Immediately, the following classes (and in some cases their sub-classes) had compile errors:
- `DxtBitmapContent`
- `AtcBitmapContent`
- `Etc1BitmapContent`
- `PvrtcBitmapContent`

From those four base types, _several_ `SurfaceFormat` types were no longer compressible. However, at this point it helps to remember that not all `SurfaceFormat` values were actually selectable by users. In the Content Builder, users select a `TextureProcessorOutputFormat` value, which _hints_ at which `SurfaceFormat` to use but doesn't strictly enforce it. For example, the user may select `TextureProcessorOutputFormat.DxtCompressed`, but exactly which DXT `SurfaceFormat` is selected is left to the Content Builder code (specifically, the `GraphicsUtil.CompressDxt` function).
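As a sketch of that hint-to-format mapping (this is illustrative only; the real logic lives in `GraphicsUtil.CompressDxt`, and the helper name and parameters below are made up for this example):

```csharp
using Microsoft.Xna.Framework.Graphics;

// Illustrative sketch only: the user's TextureProcessorOutputFormat is a *hint*,
// and the concrete SurfaceFormat is chosen by the Content Builder based on the
// texture's alpha content. This is NOT the actual GraphicsUtil.CompressDxt code.
internal static class DxtFormatSelection
{
    public static SurfaceFormat SelectDxtFormat(bool hasAlpha, bool hasFractionalAlpha)
    {
        if (!hasAlpha)
            return SurfaceFormat.Dxt1;   // fully opaque texture
        if (!hasFractionalAlpha)
            return SurfaceFormat.Dxt1a;  // only on/off (1-bit) alpha
        return SurfaceFormat.Dxt5;       // smooth alpha gradients
    }
}
```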

There is a cascade of `SurfaceFormat`s that broke, and unfortunately _Basis Universal_ doesn't actually cover all of the formats. _BasisU_'s format list is [here](https://github.com/BinomialLLC/basis_universal/blob/ad9386a4a1cf2a248f7bbd45f543a7448db15267/transcoder/basisu_transcoder.h#L49).

To cover the formats that weren't supported directly by _BasisU_, this PR also includes an external command-line utility, [Crunch](https://github.com/MonoGame/MonoGame.Tool.Crunch), as well as a NuGet library, [BcnEncoder](https://github.com/Nominom/BCnEncoder.NET).

Here is a table of all the formats that broke after I removed the old texture compression tools, whether they're still supported, and if so, which tool now does the compression.

| Format | Still Supported | Tool | Note |
| - | - | - | - |
| DXT1 | yes | BcnEncoder | | 
| DXT1a | yes | BcnEncoder | | 
| DXT3 | yes | Crunch | BcnEncoder produces bad results for textures with alpha, for DXT3 only |
| DXT5 | yes | BcnEncoder | | 
| RgbaAtcExplicitAlpha | yes | BcnEncoder | | 
| RgbaAtcInterpolatedAlpha | yes | BcnEncoder | | 
| RgbEtc1 | yes | Crunch | | 
| ETC2 | yes | Crunch | this PR adds support for ETC2 compression, which previously was not available | 
| ASTC_4x4_Rgba | yes | BasisU | this PR adds support for ASTC compression, as well as the SurfaceFormat value | 
| RgbPvrtc4Bpp | yes | BasisU | | 
| RgbaPvrtc4Bpp | yes | BasisU | | 
| RgbPvrtc2Bpp | no | - | | 
| RgbaPvrtc2Bpp | no | - | | 

Alarm bells may be going off in your brain over the lack of `RgbPvrtc2Bpp` and `RgbaPvrtc2Bpp` support. However, the correlated subtypes of `PvrtcBitmapContent` for those 2Bpp variants, `PvrtcRgb2BitmapContent` and `PvrtcRgba2BitmapContent`, had _no_ usages in the codebase. The `TextureProcessorOutputFormat.PvrCompressed` option would always lead users to the _4Bpp_ variant. This PR marks the 2Bpp variants with the `[Obsolete]` tag but does not delete them. _BasisU_ does not support the 2Bpp variants, but since the types were never used, I felt it was okay to deprecate them.
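As a sketch, the deprecation looks something like this (the message text and class names here are illustrative, not the exact code in the PR):

```csharp
using System;

// Illustrative sketch: the unused 2Bpp PVRTC content types are kept for source
// compatibility, but flagged so that any future usage produces a compiler
// warning. The base class here is a stand-in for the real PvrtcBitmapContent.
public abstract class PvrtcBitmapContentBase { }

[Obsolete("PVRTC 2bpp formats are not supported by Basis Universal; use the 4bpp variants instead.")]
public class PvrtcRgb2BitmapContentSketch : PvrtcBitmapContentBase { }

[Obsolete("PVRTC 2bpp formats are not supported by Basis Universal; use the 4bpp variants instead.")]
public class PvrtcRgba2BitmapContentSketch : PvrtcBitmapContentBase { }
```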

It turns out that _BasisU_ is actually only used for a _relatively_ small subset of the formats. The _BcnEncoder_ library is handling several formats. That library is much faster at transcoding the textures because it runs within the same dotnet process and does not need to generate any intermediate files. The _BcnEncoder_ library is a NuGet package, and comes with a few dependencies. Comparatively, the _Crunch_ and _BasisU_ tools invoke separate processes on the machine to handle the compression, and those processes require data to be passed via file, which requires the Content Builder to do file reads and writes. In both cases, I'm using another library, _KtxLoader_, to read the compressed byte arrays.
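The external-tool path can be sketched roughly like this (the tool name and arguments below are placeholders, not the actual `mgcb-basisu`/`mgcb-crunch` command lines):

```csharp
using System.Diagnostics;
using System.IO;

internal static class ExternalCompressionSketch
{
    // Illustrative sketch: unlike the in-process BcnEncoder path, an external
    // tool forces the data to round-trip through the filesystem, which is why
    // the Content Builder needs to know where intermediate files should live.
    public static byte[] CompressViaExternalTool(byte[] rgbaPixels, string intermediateDir)
    {
        var inputPath = Path.Combine(intermediateDir, Path.GetRandomFileName() + ".png");
        var outputPath = Path.ChangeExtension(inputPath, ".ktx");
        try
        {
            File.WriteAllBytes(inputPath, rgbaPixels);       // write: pixels -> disk
            using var proc = Process.Start(new ProcessStartInfo
            {
                FileName = "some-compressor",                // placeholder tool name
                Arguments = $"\"{inputPath}\" -o \"{outputPath}\"",
                UseShellExecute = false,
            });
            proc.WaitForExit();                              // block on the child process
            return File.ReadAllBytes(outputPath);            // read: compressed <- disk
        }
        finally
        {
            File.Delete(inputPath);                          // clean up intermediates
            if (File.Exists(outputPath)) File.Delete(outputPath);
        }
    }
}
```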

The file requirement is the reason for the newly added `ContextScopeFactory` in this PR. The current architecture triggers compression from within a `BitmapContent`'s public API, but since those types are originally XNA types, I cannot change their public API to include any contextual information about where to generate the intermediate files. Instead of modifying the public API, I've introduced a new concept into the Content Builder, called a _Context Scope_. Essentially, every time some content is being processed, a statically available view of the contextual information is created, which is used to infer the directory where intermediate files should be created and destroyed.
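As a usage sketch: `BeginContext`, `ActiveContext`, and `IntermediateDirectory` are the real members added in this PR, but the consumer class below is hypothetical, written to show how code with no context parameter reaches the ambient scope.

```csharp
// Hypothetical consumer: code deep inside a BitmapContent has no context
// parameter, yet can still discover where intermediate files belong, because
// the PipelineManager opened an ambient scope via
// `using var _ = ContextScopeFactory.BeginContext(processContext);`
// before calling into the processor.
internal static class IntermediateFilesSketch
{
    public static string NewPath(string extension)
    {
        // Throws a PipelineException if no content operation is in flight.
        var context = ContextScopeFactory.ActiveContext;
        return System.IO.Path.Combine(
            context.IntermediateDirectory,
            System.IO.Path.GetRandomFileName() + extension);
    }
}
```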

As noted in the table above, this PR adds support for `ETC2`. I've deprecated the `TextureProcessorOutputFormat.Etc1Compressed` option, and added a more generalized `TextureProcessorOutputFormat.EtcCompressed` option. This new option will use ETC2 compression if the source texture is found to have non-opaque alpha values. Otherwise, ETC1 is still used.
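The selection rule can be sketched as a simple alpha scan (the method name and the assumed RGBA8 pixel layout are illustrative, not the PR's actual detection code):

```csharp
internal static class EtcSelectionSketch
{
    // Illustrative sketch of the EtcCompressed rule: scan RGBA8 pixel data
    // (alpha is every 4th byte) for any non-opaque value.
    public static bool NeedsEtc2(byte[] rgbaPixels)
    {
        for (int i = 3; i < rgbaPixels.Length; i += 4)
        {
            if (rgbaPixels[i] != 255)
                return true;   // translucency found: ETC1 cannot represent alpha
        }
        return false;          // fully opaque: ETC1 is sufficient
    }
}
```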

Also, this PR adds support for ASTC. To support this, I created brand new entries in both the `TextureProcessorOutputFormat` enum and the `SurfaceFormat` enum.

Basis Universal was packaged into a custom [monogame tool](https://github.com/MonoGame/MonoGame.Tool.BasisUniversal), and published to [nuget](https://www.nuget.org/packages/mgcb-basisu).

The original paid bounty issue can be found here:
MonoGame#8419

I also had a draft PR against my personal fork of MonoGame. A fair amount of development discussion took place on that thread, so feel free to review it:
cdhanna#1

Known Issues
- Multi-threaded context issue?
   - I need to do a bit of digging into the `ContextScopeFactory` and make sure it works in a possibly multi-threaded scenario. I was testing it via Unit Tests.
- How do users get access to the dotnet tools `mgcb-crunch` and `mgcb-basisu`?
   - For development, there is a local `.config/dotnet-tools.json` file which makes the tools available to use during Unit Tests. But it's unclear to me how these tools get resolved in an actual production use case. I think there is a plan amongst the maintainers of MonoGame, but I don't fully understand it (yet).
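For reference, a tool manifest of this shape is what makes the tools restorable via `dotnet tool restore` (the version numbers below are placeholders, not the versions actually pinned in the PR):

```json
{
  "version": 1,
  "isRoot": true,
  "tools": {
    "mgcb-basisu": {
      "version": "1.0.0",
      "commands": [ "mgcb-basisu" ]
    },
    "mgcb-crunch": {
      "version": "1.0.0",
      "commands": [ "mgcb-crunch" ]
    }
  }
}
```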
   
   
  ----

I've enjoyed working on this, and I've learned _a lot_. Thanks to all those who have helped out and answered my questions :) 

Co-authored-by: Chris Hanna <[email protected]>
cdhanna and chrisbeamable authored Sep 13, 2024
1 parent 584eaa2 commit 7a398b0
Showing 28 changed files with 1,924 additions and 308 deletions.
2 changes: 1 addition & 1 deletion MonoGame.Framework.Content.Pipeline/AssemblyInfo.cs
@@ -2,4 +2,4 @@
using System.Runtime.CompilerServices;

[assembly:InternalsVisibleTo("MonoGame.Effect")]

[assembly:InternalsVisibleTo("MonoGame.Tools.Tests")]
38 changes: 22 additions & 16 deletions MonoGame.Framework.Content.Pipeline/Builder/PipelineManager.cs
@@ -86,8 +86,8 @@ private struct ProcessorInfo
/// </summary>
public bool CompressContent { get; set; }

/// <summary>
/// If true exceptions thrown from within an importer or processor are caught and then
/// <summary>
/// If true exceptions thrown from within an importer or processor are caught and then
/// thrown from the context. Default value is true.
/// </summary>
public bool RethrowExceptions { get; set; }
@@ -108,7 +108,7 @@ public PipelineManager(string projectDir, string outputDir, string intermediateD

RegisterCustomConverters();

// Load the previous content stats.
// Load the previous content stats.
ContentStats = new ContentStatsCollection();
ContentStats.PreviousStats = ContentStatsCollection.Read(intermediateDir);
}
@@ -157,9 +157,9 @@ private void ResolveAssemblies()
try
{
Assembly a;
if (string.IsNullOrEmpty(assemblyPath))
a = Assembly.GetExecutingAssembly();
else
if (string.IsNullOrEmpty(assemblyPath))
a = Assembly.GetExecutingAssembly();
else
a = Assembly.LoadFrom(assemblyPath);

exportedTypes = a.GetTypes();
@@ -182,7 +182,7 @@ private void ResolveAssemblies()

foreach (var t in exportedTypes)
{
if (t.IsAbstract)
if (t.IsAbstract)
continue;

if (t.GetInterface(@"IContentImporter") != null)
@@ -242,7 +242,7 @@ public Type[] GetImporterTypes()

List<Type> types = new List<Type>();

foreach (var item in _importers)
foreach (var item in _importers)
{
types.Add(item.type);
}
@@ -254,14 +254,14 @@ public Type[] GetProcessorTypes()
{
if (_processors == null)
ResolveAssemblies();

List<Type> types = new List<Type>();
foreach (var item in _processors)

foreach (var item in _processors)
{
types.Add(item.type);
}

return types.ToArray();
}

@@ -535,7 +535,7 @@ public PipelineBuildEvent BuildContent(string sourceFilepath, string outputFilep
{
sourceFilepath = PathHelper.Normalize(sourceFilepath);
ResolveOutputFilepath(sourceFilepath, ref outputFilepath);

ResolveImporterAndProcessor(sourceFilepath, ref importerName, ref processorName);

// Record what we're building and how.
@@ -570,12 +570,12 @@ private void BuildContent(PipelineBuildEvent pipelineEvent, PipelineBuildEvent c
// Keep track of all build events. (Required to resolve automatic names "AssetName_n".)
TrackPipelineBuildEvent(pipelineEvent);

var rebuild = pipelineEvent.NeedsRebuild(this, cachedEvent);
var rebuild = pipelineEvent.NeedsRebuild(this, cachedEvent);
if (rebuild)
Logger.LogMessage("{0}", pipelineEvent.SourceFile);
else
Logger.LogMessage("Skipping {0}", pipelineEvent.SourceFile);

Logger.Indent();
try
{
@@ -604,7 +604,7 @@ private void BuildContent(PipelineBuildEvent pipelineEvent, PipelineBuildEvent c
Parameters = assetCachedEvent.Parameters,
};

// Give the asset a chance to rebuild.
// Give the asset a chance to rebuild.
BuildContent(depEvent, assetCachedEvent, assetEventFilepath);
}
}
@@ -665,7 +665,10 @@ public object ProcessContent(PipelineBuildEvent pipelineEvent)
{
try
{

var importContext = new PipelineImporterContext(this, pipelineEvent);
using var _ = ContextScopeFactory.BeginContext(importContext, pipelineEvent);

importedObject = importer.Import(pipelineEvent.SourceFile, importContext);
}
catch (PipelineException)
@@ -684,6 +687,7 @@ public object ProcessContent(PipelineBuildEvent pipelineEvent)
else
{
var importContext = new PipelineImporterContext(this, pipelineEvent);
using var _ = ContextScopeFactory.BeginContext(importContext, pipelineEvent);
importedObject = importer.Import(pipelineEvent.SourceFile, importContext);
}

@@ -714,6 +718,7 @@ public object ProcessContent(PipelineBuildEvent pipelineEvent)
try
{
var processContext = new PipelineProcessorContext(this, pipelineEvent);
using var _ = ContextScopeFactory.BeginContext(processContext);
processedObject = processor.Process(importedObject, processContext);
}
catch (PipelineException)
@@ -732,6 +737,7 @@ public object ProcessContent(PipelineBuildEvent pipelineEvent)
else
{
var processContext = new PipelineProcessorContext(this, pipelineEvent);
using var _ = ContextScopeFactory.BeginContext(processContext);
processedObject = processor.Process(importedObject, processContext);
}

@@ -52,8 +52,9 @@ public override TOutput Convert<TInput, TOutput>( TInput input,
{
var processor = _manager.CreateProcessor(processorName, processorParameters);
var processContext = new PipelineProcessorContext(_manager, new PipelineBuildEvent { Parameters = processorParameters } );
using var _ = ContextScopeFactory.BeginContext(processContext);
var processedObject = processor.Process(input, processContext);

// Add its dependencies and built assets to ours.
_pipelineEvent.Dependencies.AddRangeUnique(processContext._pipelineEvent.Dependencies);
_pipelineEvent.BuildAsset.AddRangeUnique(processContext._pipelineEvent.BuildAsset);
218 changes: 218 additions & 0 deletions MonoGame.Framework.Content.Pipeline/ContextScopeFactory.cs
@@ -0,0 +1,218 @@
// MonoGame - Copyright (C) MonoGame Foundation, Inc
// This file is subject to the terms and conditions defined in
// file 'LICENSE.txt', which is part of this source code package.

using System;
using System.Collections.Generic;
using System.Threading;
using Microsoft.Xna.Framework.Content.Pipeline;
using MonoGame.Framework.Content.Pipeline.Builder;

namespace MonoGame.Framework.Content
{

/// <summary>
/// The <see cref="IContentContext"/> represents the context of a content operation.
/// Usually, this represents a <see cref="ContentImporterContext"/>
/// or <see cref="ContentProcessorContext"/>.
///
/// However, because those types are part of the XNA namespace, they cannot be modified directly.
/// This interface is an adapter over those types.
/// </summary>
internal interface IContentContext : IDisposable
{
/// <inheritdoc cref="ContentImporterContext.IntermediateDirectory"/>
public string IntermediateDirectory { get; }

/// <inheritdoc cref="ContentImporterContext.Logger"/>
public ContentBuildLogger Logger { get; }

/// <inheritdoc cref="ContentProcessorContext.SourceIdentity"/>
public ContentIdentity SourceIdentity { get; }
}

/// <summary>
/// <para>
/// The <see cref="ContextScopeFactory"/> facilitates access to a <see cref="IContentContext"/>
/// instance without direct access to the actual context.
/// </para>
///
/// <para>
/// Anytime a content context operation is about to start, the operation should signal
/// the <see cref="BeginContext(IContentContext)"/> method.
/// <b> Critically </b>, the content operation must <b>dispose</b> the resulting context
/// when the operation has concluded.
/// </para>
///
/// <para>
/// The most recent content operation can be retrieved with the <see cref="ActiveContext"/>
/// property.
/// </para>
///
/// </summary>
internal static class ContextScopeFactory
{
private static AsyncLocal<List<IContentContext>> _contextStack = new AsyncLocal<List<IContentContext>>
{
Value = new List<IContentContext>(1)
};
private static AsyncLocal<IContentContext> _activeContext = new AsyncLocal<IContentContext>();

/// <summary>
/// Returns true when the <see cref="ActiveContext"/> is a valid <see cref="IContentContext"/> instance.
/// </summary>
public static bool HasActiveContext => _activeContext.Value != null;

/// <summary>
/// Access the latest <see cref="IContentContext"/> operation.
/// If no operations are running (and therefore there is no context), this
/// accessor will throw a <see cref="PipelineException"/>.
///
/// Use the <see cref="HasActiveContext"/> to check if there is an active context.
///
/// <para>
/// Each Task-chain may have its own unique ActiveContext, but if a task-chain
/// does not have an active context, then the parent task's active context will be used
/// recursively. This is the behaviour of AsyncLocal.
/// </para>
/// </summary>
/// <exception cref="PipelineException"></exception>
public static IContentContext ActiveContext
{
get
{
if (!HasActiveContext)
{
throw new PipelineException(
$"Cannot access {nameof(ActiveContext)} because there is no active context. Make sure that {nameof(ContextScopeFactory)}.{nameof(BeginContext)} has been called with the `using` keyword");
}

return _activeContext.Value;
}
}

/// <summary>
/// Start a <see cref="ContentProcessorContext"/> operation.
/// The <see cref="ContentProcessorContext"/> instance will be adapted into a
/// <see cref="IContentContext"/>
///
/// </summary>
/// <param name="context"></param>
/// <returns>
/// <b>this return value must be disposed when the context operation is complete!</b>
/// </returns>
public static IContentContext BeginContext(ContentProcessorContext context)
{
return BeginContext(new ContentProcessorContextAdapter(context));
}


/// <summary>
/// Start a <see cref="ContentImporterContext"/> operation.
/// The <see cref="ContentImporterContext"/> instance will be adapted into a
/// <see cref="IContentContext"/>
///
/// </summary>
/// <param name="context"></param>
/// <param name="evt">
/// A <see cref="ContentImporterContext"/> does not include a source file,
/// but the originating <see cref="PipelineBuildEvent"/> <i>does</i>. The event
/// will be used to fulfill the <see cref="IContentContext.SourceIdentity"/> value.
/// </param>
/// <returns>
/// <b>this return value must be disposed when the context operation is complete!</b>
/// </returns>
public static IContentContext BeginContext(ContentImporterContext context, PipelineBuildEvent evt)
{
return BeginContext(new ContentImporterContextAdapter(context, evt));
}

/// <summary>
/// Start a content context operation.
/// </summary>
/// <param name="scope"></param>
/// <returns>
/// <b>this return value must be disposed when the context operation is complete!</b>
/// </returns>
public static IContentContext BeginContext(IContentContext scope)
{
if (_contextStack.Value == null)
_contextStack.Value = new List<IContentContext>(1);

_contextStack.Value.Add(scope);
_activeContext.Value = scope;
return scope;
}

/// <summary>
/// The default implementation of the <see cref="IContentContext"/>
/// provides a basic Dispose() method that will remove the context
/// from the history in the <see cref="ContextScopeFactory"/>
/// </summary>
public abstract class ContextScope : IContentContext
{
public abstract string IntermediateDirectory { get; }
public abstract ContentBuildLogger Logger { get; }
public abstract ContentIdentity SourceIdentity { get; }

/// <summary>
/// Remove this context operation from the history in the <see cref="ContextScopeFactory"/>.
/// If this context was the <see cref="ContextScopeFactory.ActiveContext"/>, then this
/// method will reset the <see cref="ContextScopeFactory.ActiveContext"/> value to the next
/// most-recent context operation, or null if none exist.
/// </summary>
public virtual void Dispose()
{
_contextStack.Value.Remove(this);

// if someone else has already claimed the activeContext, then we don't need to care.
if (_activeContext.Value != this) return;

// either use the "most recent" (aka, last) context, or if the list is empty,
// there is no context.
_activeContext.Value = _contextStack.Value.Count > 0
? _contextStack.Value[^1]
: null;
}

}

private class ContentProcessorContextAdapter : ContextScope
{
private readonly ContentProcessorContext _context;

public ContentProcessorContextAdapter(ContentProcessorContext context)
{
_context = context;
}

public override string IntermediateDirectory => _context.IntermediateDirectory;
public override ContentBuildLogger Logger => _context.Logger;
public override ContentIdentity SourceIdentity => _context.SourceIdentity;

public override void Dispose()
{
base.Dispose();
if (_context is IDisposable disposable)
{
disposable.Dispose();
}
}
}

private class ContentImporterContextAdapter : ContextScope
{
private readonly ContentImporterContext _context;

public ContentImporterContextAdapter(ContentImporterContext context, PipelineBuildEvent evt)
{
_context = context;
SourceIdentity = new ContentIdentity(sourceFilename: evt.SourceFile);
}

public override string IntermediateDirectory => _context.IntermediateDirectory;
public override ContentBuildLogger Logger => _context.Logger;
public override ContentIdentity SourceIdentity { get; }
}
}
}