Merge branch '1.4_maintenance'
johnhaddon committed Aug 27, 2024
2 parents c9b577f + 0dd8486 commit 82e1e9c
Showing 7 changed files with 174 additions and 40 deletions.
20 changes: 19 additions & 1 deletion Changes.md
@@ -70,12 +70,28 @@ Breaking Changes
- Loop : Removed `nextIterationContext()` method.
- NodeGadget, ConnectionGadget : Removed `activeForFocusNode()` virtual methods. Override `updateFromContextTracker()` instead.

1.4.x.x (relative to 1.4.10.0)
1.4.x.x (relative to 1.4.11.0)
=======

Improvements
------------

- UI Editor :
- Added the ability to edit the scale of node icons.
- Improved layout of Box node plug creator visibility toggles.
- ArnoldShader : Moved the `toon` shader's `*_tonemap_hue_saturation` parameters to appropriate sections in the UI.

API
---

- MetadataWidget : Added `NumericMetadataWidget` class.

1.4.11.0 (relative to 1.4.10.0)
========

Improvements
------------

- SetExpressions : Set expressions containing only whitespace characters are now treated as empty rather than producing an error.
- ArnoldShader :
- Added a UI layout for the new `openpbr_surface` shader.
@@ -86,6 +102,7 @@ Improvements
- Blue : Proxy
- Red : Guide
- Catalogue : Added a handle for controlling the relative sizes of the listing and image property widgets.
- RenderPassEditor, LightEditor : Improved update performance for certain graph configurations, by optimising `SceneAlgo::history()` (#5199).

Fixes
-----
@@ -98,6 +115,7 @@ Fixes
- Fixed bug which allowed locked Catalogues to be edited.
- Fixed NodeEditor update when the first image is added or the last image is removed.
- NameWidget : Fixed bug which allowed plugs on locked nodes to be renamed.
- ValuePlug : Fixed the plug passed to `Monitor::forceMonitoring()`. Previously `Process::destinationPlug()` was being passed instead of `Process::plug()`.

API
---
14 changes: 14 additions & 0 deletions arnoldPlugins/gaffer.mtd
@@ -3035,6 +3035,10 @@
gaffer.graphEditorLayout.visible BOOL true
page STRING "Edge"
gaffer.layout.activator STRING "edgeEnabled"
[attr edge_tonemap_hue_saturation]
page STRING "Edge"
label STRING "Tonemap Hue Saturation"
gaffer.layout.activator STRING "edgeEnabled"
[attr edge_opacity]
page STRING "Edge"
gaffer.layout.activator STRING "edgeEnabled"
@@ -3083,6 +3087,10 @@
gaffer.graphEditorLayout.visible BOOL true
page STRING "Silhouette"
gaffer.layout.activator STRING "silhouetteEnabled"
[attr silhouette_tonemap_hue_saturation]
page STRING "Silhouette"
label STRING "Tonemap Hue Saturation"
gaffer.layout.activator STRING "silhouetteEnabled"
[attr silhouette_opacity]
page STRING "Silhouette"
gaffer.layout.activator STRING "silhouetteEnabled"
@@ -3099,6 +3107,9 @@
[attr base_tonemap]
gaffer.graphEditorLayout.visible BOOL true
page STRING "Base"
[attr base_tonemap_hue_saturation]
page STRING "Base"
label STRING "Tonemap Hue Saturation"

[attr specular]
page STRING "Specular"
@@ -3112,6 +3123,9 @@
page STRING "Specular"
[attr specular_tonemap]
page STRING "Specular"
[attr specular_tonemap_hue_saturation]
page STRING "Specular"
label STRING "Tonemap Hue Saturation"

[attr lights]
page STRING "Stylized Highlight"
40 changes: 40 additions & 0 deletions python/GafferSceneTest/SceneAlgoTest.py
@@ -1946,6 +1946,46 @@ def assertNoCanceller( history ) :

assertNoCanceller( history )

@GafferTest.TestRunner.PerformanceTestMethod()
def testHistoryDiamondPerformance( self ) :

# A series of diamonds where every iteration reads the output of the
# previous iteration twice and then feeds into the next iteration.
#
# o
# / \
# o o
# \ /
# o
# / \
# o o
# \ /
# o
# / \
# ...
# \ /
# o
#
# Without caching, this leads to an explosion of paths through the
# graph, and can lead to poor performance in `history()`.

plane = GafferScene.Plane()

loop = Gaffer.Loop()
loop.setup( plane["out"] )
loop["in"].setInput( plane["out"] )

copyOptions = GafferScene.CopyOptions()
copyOptions["in"].setInput( loop["previous"] )
copyOptions["source"].setInput( loop["previous"] )
copyOptions["options"].setValue( "*" )

loop["next"].setInput( copyOptions["out"] )
loop["iterations"].setValue( 20 )

with GafferTest.TestRunner.PerformanceScope() :
GafferScene.SceneAlgo.history( loop["out"]["globals"] )

def testLinkingQueries( self ) :

# Everything linked to `defaultLights` via the default value for the attribute.
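The optimisation exercised by `testHistoryDiamondPerformance` above matters because clients of `SceneAlgo.history()` - the RenderPassEditor and LightEditor among them - recurse through the returned structure, so an uncached diamond multiplies the paths they visit. Below is a minimal sketch of such a traversal, assuming the `predecessors` attribute exposed by the `History` Python bindings; the helper function and the two-node network are illustrative only, not part of the test suite.

import GafferScene

plane = GafferScene.Plane()
options = GafferScene.CustomOptions()
options["in"].setInput( plane["out"] )

def countHistoryItems( item ) :

	# Each History item records the scene plug and context it was evaluated
	# in; `predecessors` holds the upstream items that contributed to it.
	return 1 + sum( countHistoryItems( p ) for p in item.predecessors )

history = GafferScene.SceneAlgo.history( options["out"]["globals"] )
print( countHistoryItems( history ) )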
44 changes: 42 additions & 2 deletions python/GafferUI/MetadataWidget.py
@@ -109,12 +109,12 @@ def _updateFromValue( self, value ) :

## Must be called by derived classes to update
# the Metadata value when the widget value changes.
def _updateFromWidget( self, value ) :
def _updateFromWidget( self, value, mergeGroup = "" ) :

if self.__target is None :
return

with Gaffer.UndoScope( self.__target.ancestor( Gaffer.ScriptNode ) ) :
with Gaffer.UndoScope( self.__target.ancestor( Gaffer.ScriptNode ), mergeGroup = mergeGroup ) :
Gaffer.Metadata.registerValue( self.__target, self.__key, value )

## May be called by derived classes to deregister the
@@ -347,3 +347,43 @@ def __buttonClicked( self, widget ) :
if chosenPath is not None :
self.__path.setFromString( str( chosenPath ) )
self.__editingFinished()

class NumericMetadataWidget( MetadataWidget ) :

def __init__( self, key, target = None, defaultValue = 0, **kw ) :

self.__numericWidget = GafferUI.NumericWidget( value = defaultValue )

self.__defaultValue = defaultValue
# we use these to decide which actions to merge into a single undo
self.__lastChangedReason = None
self.__mergeGroupId = 0

MetadataWidget.__init__( self, self.__numericWidget, key, target, defaultValue = defaultValue, **kw )

self.__numericWidget.valueChangedSignal().connect(
Gaffer.WeakMethod( self.__valueChanged ), scoped = False
)

def numericWidget( self ) :

return self.__numericWidget

def _updateFromValue( self, value ) :

self.__numericWidget.setValue( type( self.__defaultValue )( value ) )

def __valueChanged( self, widget, reason ) :

if reason == GafferUI.NumericWidget.ValueChangedReason.InvalidEdit :
self._updateFromValue( self.defaultValue() )
return

if not widget.changesShouldBeMerged( self.__lastChangedReason, reason ) :
self.__mergeGroupId += 1
self.__lastChangedReason = reason

self._updateFromWidget(
self.__numericWidget.getValue(),
mergeGroup = "NumericMetadataWidget{}{}".format( id( self, ), self.__mergeGroupId )
)
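As a minimal usage sketch for the new class, based on the constructor above: the `iconScale` key and default mirror the UIEditor change below, while the script and node are illustrative, and the import path assumes the `GafferUI.MetadataWidget` module shown in this diff. Values committed in the field are registered with `Gaffer.Metadata.registerValue()` against the target, and successive edits are merged into grouped undo steps via the merge group.

import Gaffer
from GafferUI import MetadataWidget

script = Gaffer.ScriptNode()
script["node"] = Gaffer.Node()

# A numeric field bound to the node's "iconScale" metadata value.
scaleWidget = MetadataWidget.NumericMetadataWidget(
	key = "iconScale",
	target = script["node"],
	defaultValue = 1.5,
)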
10 changes: 8 additions & 2 deletions python/GafferUI/UIEditor.py
@@ -111,12 +111,18 @@ def __init__( self, scriptNode, **kw ) :
MetadataWidget.FileSystemPathMetadataWidget( key = "icon" )
)

GafferUI.Label( "Scale" )

scaleWidget = MetadataWidget.NumericMetadataWidget( key = "iconScale", defaultValue = 1.5 )
scaleWidget.numericWidget()._qtWidget().setMaximumWidth( 60 )
self.__nodeMetadataWidgets.append( scaleWidget )

with _Row() as self.__plugAddButtons :

_Label( "Plug Creators" )

for side in ( "Top", "Bottom", "Left", "Right" ) :
_Label( side )._qtWidget().setFixedWidth( 40 )
GafferUI.Label( side )
self.__nodeMetadataWidgets.append( MetadataWidget.BoolMetadataWidget(
key = "noduleLayout:customGadget:addButton%s:visible" % side,
defaultValue = True
@@ -400,7 +406,7 @@ class _Row( GafferUI.ListContainer ) :

def __init__( self, *args, **kw ) :

GafferUI.ListContainer.__init__( self, GafferUI.ListContainer.Orientation.Horizontal, spacing = 4, *args, **kw )
GafferUI.ListContainer.__init__( self, GafferUI.ListContainer.Orientation.Horizontal, spacing = 8, *args, **kw )

##########################################################################
# Hierarchical representation of a plug layout, suitable for manipulating
2 changes: 1 addition & 1 deletion src/Gaffer/ValuePlug.cpp
@@ -241,7 +241,7 @@ class ValuePlug::HashProcess : public Process
// Then get our hash. We do this using this `acquireHash()` functor so that
// we can repeat the process for `Checked` mode.

const bool forceMonitoring = Process::forceMonitoring( threadState, plug, staticType );
const bool forceMonitoring = Process::forceMonitoring( threadState, p, staticType );

auto acquireHash = [&]( const HashCacheKey &cacheKey ) {

84 changes: 50 additions & 34 deletions src/GafferScene/SceneAlgo.cpp
@@ -437,8 +437,6 @@ class CapturingMonitor : public Monitor
{
const Plug *p = process->plug();

CapturedProcess::Ptr capturedProcess;
ProcessOrScope entry;
if( !shouldCapture( p ) )
{
// Parents may spawn other processes in support of the requested plug. This is one
@@ -450,39 +448,44 @@
// order of the stack is preserved - if this happens out of order, the stack will be
// corrupted, and weird crashes will happen. But as long as it is created in
// processStarted, and released in processFinished, everything should line up.
entry = std::make_unique<Monitor::Scope>( this, false );
}
else
{
capturedProcess = std::make_unique<CapturedProcess>();
capturedProcess->type = process->type();
capturedProcess->plug = p;
capturedProcess->destinationPlug = process->destinationPlug();
capturedProcess->context = new Context( *process->context(), /* omitCanceller = */ true );
entry = capturedProcess.get();
Mutex::scoped_lock lock( m_mutex );
m_processMap[process] = std::make_unique<Monitor::Scope>( this, false );
return;
}

// Capture this process.

CapturedProcess::Ptr capturedProcess = std::make_unique<CapturedProcess>();
capturedProcess->type = process->type();
capturedProcess->plug = p;
capturedProcess->destinationPlug = process->destinationPlug();
capturedProcess->context = new Context( *process->context(), /* omitCanceller = */ true );

Mutex::scoped_lock lock( m_mutex );
m_processMap[process] = std::move( entry );
m_processMap[process] = capturedProcess.get();

if( capturedProcess )
ProcessMap::const_iterator it = m_processMap.find( process->parent() );
if( it != m_processMap.end() )
{
ProcessMap::const_iterator it = m_processMap.find( process->parent() );
if( it != m_processMap.end() )
CapturedProcess * const * parent = std::get_if<CapturedProcess *>( &it->second );
if( parent && *parent )
{
CapturedProcess * const * parent = std::get_if<CapturedProcess *>( &it->second );
if( parent && *parent )
{
(*parent)->children.push_back( std::move( capturedProcess ) );
}
}
else
{
// Either `process->parent()` was null, or was started
// before we were made active via `Monitor::Scope`.
m_rootProcesses.push_back( std::move( capturedProcess ) );
(*parent)->children.push_back( std::move( capturedProcess ) );
}
}
else
{
// Either `process->parent()` was null, or was started
// before we were made active via `Monitor::Scope`.
m_rootProcesses.push_back( std::move( capturedProcess ) );
}

// Remember that we've monitored this process so that we don't force
// its monitoring again.

IECore::MurmurHash h = process->context()->hash();
h.append( reinterpret_cast<intptr_t>( p ) );
m_monitoredSet.insert( h );
}

void processFinished( const Process *process ) override
@@ -498,12 +501,22 @@

bool forceMonitoring( const Plug *plug, const IECore::InternedString &processType ) override
{
if( processType == g_hashProcessType && shouldCapture( plug ) )
if( processType != g_hashProcessType || !shouldCapture( plug ) )
{
return true;
return false;
}

return false;
// Don't force the monitoring of a process we've monitored already. This does
// mean we throw away diamond dependencies in the process graph, but it is essential
// for performance in some cases - see `testHistoryDiamondPerformance()` for example.
/// \todo Potentially we could use the hash to find the previously captured process,
/// and instance that into our process graph. This would require clients of `History`
/// to be updated to handle such topologies efficiently by tracking previously visited
/// items. It may also be of fairly low value, since typically our goal is to find the
/// first relevant path through the graph to present to the user.
IECore::MurmurHash h = Context::current()->hash();
h.append( reinterpret_cast<intptr_t>( plug ) );
return !m_monitoredSet.count( h );
}

private :
@@ -518,15 +531,18 @@
}

using Mutex = tbb::spin_mutex;

Mutex m_mutex;

using ProcessOrScope = std::variant<CapturedProcess *, std::unique_ptr<Monitor::Scope>>;
using ProcessMap = boost::unordered_map<const Process *, ProcessOrScope>;

const IECore::InternedString m_scenePlugChildName;

// Protects `m_processMap` and `m_rootProcesses`.
/// \todo Perhaps they should be concurrent containers instead?
Mutex m_mutex;
ProcessMap m_processMap;
CapturedProcess::PtrVector m_rootProcesses;
IECore::InternedString m_scenePlugChildName;

tbb::concurrent_unordered_set<IECore::MurmurHash> m_monitoredSet;

};

