I was messing around with an application that is very heavily XML-based; it tracks each XML element with a 4-byte identifier. I was hoping to build a large enum mapping each identifier to a made-up name (right now it's just xml_value_%08x % (val)). The enum got very large once I scripted its creation. When I start using this enum in the image, the decompiler starts timing out or stopping with memory-constraint issues in functions where the enum is used (the same functions decompile fine with the type as uint).
# create the enum
my_enum = EnumDataType(my_cat_path, my_enum_name, 4)
dtm.addDataType(my_enum, None)
dtm.flushEvents()

# other stuff
# all the logic to enumerate all the values is done here; get the type and add them
enumdt = getDataTypes(my_enum_name)[0]
for name, value in my_enum_values:
    enumdt.add(name, value)
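For reference, the (name, value) list fed into the loop above can be sketched in plain Python. The helper name and the sample identifiers are invented; the real list comes from whatever 4-byte IDs are observed in the target:

```python
def make_enum_entries(identifiers):
    # Generate (name, value) pairs using the xml_value_%08x naming
    # scheme mentioned above. `identifiers` is the collection of
    # 4-byte IDs pulled from the application (samples are made up).
    return [("xml_value_%08x" % val, val) for val in identifiers]

my_enum_values = make_enum_entries([0x0000BEEF, 0xDEADBEEF])
```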
The full run produces on the order of 65k values; trimming it down to around 12k seemed to reduce the issue enough for the decompiler to finish within its 50MB memory limit.
Any thoughts on this approach? Is an enum this large just not going to be usable? I'm not certain it's feasible yet, but an alternative might be a union of several smaller enums, if I can categorize the identifiers somehow and then figure out how to cast from the union to the right enum. Is it a bug that a large enum causes this, or is it semi-expected?
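The categorization idea could be sketched like this: bucket the identifiers by their high byte so each resulting enum stays small. The high-byte split is a made-up heuristic; real categories would presumably come from the application's XML schema:

```python
from collections import defaultdict

def partition_by_high_byte(identifiers):
    # Group 4-byte identifiers into buckets keyed by their top byte.
    # Each bucket would then become its own, much smaller enum
    # (~256 values each instead of one 65k-entry enum).
    buckets = defaultdict(list)
    for val in identifiers:
        buckets[(val >> 24) & 0xFF].append(("xml_value_%08x" % val, val))
    return buckets

buckets = partition_by_high_byte([0x01000001, 0x01000002, 0x02000001])
```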
(@ghidra1 / @caheckman sorry for the ping but figured you two would be good to ask and/or might have thoughts on this)
10.1.4 / win10 / 11.0.10