LoadError: UndefVarError: #flatten not defined #191

Open
smart-fr opened this issue May 2, 2023 · 0 comments

smart-fr commented May 2, 2023

I think I messed up my installation of AlphaZero, but I can't recall exactly how this happened.

When launching a training session, I get the following error:

Loading environment from: sessions/bonbon-rectangle-128_train

ERROR: LoadError: UndefVarError: #flatten not defined
Stacktrace:
  [1] deserialize_datatype(s::Serialization.Serializer{IOStream}, full::Bool)
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:1364
  [2] handle_deserialize(s::Serialization.Serializer{IOStream}, b::Int32)
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:866
  [3] deserialize(s::Serialization.Serializer{IOStream})
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:813
  [4] deserialize_datatype(s::Serialization.Serializer{IOStream}, full::Bool)
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:1383
  [5] handle_deserialize(s::Serialization.Serializer{IOStream}, b::Int32)
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:866
  [6] deserialize(s::Serialization.Serializer{IOStream})
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:813
  [7] deserialize_datatype(s::Serialization.Serializer{IOStream}, full::Bool)
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:1388
  [8] handle_deserialize(s::Serialization.Serializer{IOStream}, b::Int32)
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:866
  [9] deserialize(s::Serialization.Serializer{IOStream})
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:813
 [10] handle_deserialize(s::Serialization.Serializer{IOStream}, b::Int32)
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:873
 [11] deserialize(s::Serialization.Serializer{IOStream}, t::DataType)
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:1467
 [12] handle_deserialize(s::Serialization.Serializer{IOStream}, b::Int32)
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:882
 [13] deserialize(s::Serialization.Serializer{IOStream})
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:813
 [14] handle_deserialize(s::Serialization.Serializer{IOStream}, b::Int32)
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:919
 [15] deserialize
    @ C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:813 [inlined]
 [16] deserialize(s::IOStream)
    @ Serialization C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:800
 [17] open(f::typeof(Serialization.deserialize), args::String; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Base .\io.jl:384
 [18] open
    @ .\io.jl:381 [inlined]
 [19] deserialize
    @ C:\Program Files\Julia-1.8.5\share\julia\stdlib\v1.8\Serialization\src\Serialization.jl:810 [inlined]
 [20] load_env(dir::String)
    @ AlphaZero.UserInterface C:\Projets\BonbonRectangle\IA\dev\AlphaZero.jl\src\ui\session.jl:113
 [21] AlphaZero.UserInterface.Session(e::AlphaZero.Experiments.Experiment; dir::String, autosave::Bool, nostdout::Bool, save_intermediate::Bool)
    @ AlphaZero.UserInterface C:\Projets\BonbonRectangle\IA\dev\AlphaZero.jl\src\ui\session.jl:286
 [22] train(e::AlphaZero.Experiments.Experiment; args::Base.Pairs{Symbol, Any, Tuple{Symbol, Symbol}, NamedTuple{(:save_intermediate, :dir), Tuple{Bool, String}}})
    @ AlphaZero.Scripts C:\Projets\BonbonRectangle\IA\dev\AlphaZero.jl\src\scripts\scripts.jl:26
 [23] #train#15
    @ C:\Projets\BonbonRectangle\IA\dev\AlphaZero.jl\src\scripts\scripts.jl:28 [inlined]
 [24] top-level scope
    @ C:\Projets\BonbonRectangle\IA\dev\AlphaZero.jl\bin\main.jl:23
in expression starting at C:\Projets\BonbonRectangle\IA\dev\AlphaZero.jl\bin\main.jl:23
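
Since the stack trace shows the failure happening inside Serialization while load_env reads the saved session, I assume the session was serialized under a Flux version whose internal #flatten no longer exists in the Flux currently installed in my environment. A quick way to confirm which Flux version the AlphaZero.jl project actually resolves to (just a sketch, using the checkout path from the stack trace):

using Pkg
Pkg.activate(raw"C:\Projets\BonbonRectangle\IA\dev\AlphaZero.jl")  # my AlphaZero.jl checkout
Pkg.status("Flux")  # prints the Flux version this environment currently uses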

If I Google "UndefVarError: #flatten not defined", the release notes of the Flux.jl package indicate that its recent version 0.13.14 fixed this issue: https://github.com/FluxML/Flux.jl/releases.

However, this version seems to be incompatible with AlphaZero.jl: if I modify Project.toml in my customized copy of AlphaZero.jl to allow Flux 0.13.14, I get the following error:

ERROR: Unsatisfiable requirements detected for package cuDNN [02a925ec]:
 cuDNN [02a925ec] log:
 ├─possible versions are: 1.0.0-1.0.2 or uninstalled
 └─restricted by compatibility requirements with CUDA [052768ef] to versions: uninstalled — no versions left
   └─CUDA [052768ef] log:
     ├─possible versions are: 0.1.0-4.2.0 or uninstalled
     └─restricted to versions 3 by AlphaZero [7a1cc850], leaving only versions 3.0.0-3.13.1
       └─AlphaZero [7a1cc850] log:
         ├─possible versions are: 0.5.4 or uninstalled
         └─AlphaZero [7a1cc850] is fixed to version 0.5.4
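
For reference, the change I'm describing amounts to raising the Flux compat bound of AlphaZero.jl; from the REPL the equivalent would be roughly the following (a sketch of the idea, not literally what I ran, since I edited Project.toml by hand):

using Pkg
Pkg.activate(raw"C:\Projets\BonbonRectangle\IA\dev\AlphaZero.jl")  # my AlphaZero.jl checkout
Pkg.compat("Flux", "0.13.14")  # raise the [compat] entry for Flux (Pkg.compat needs Julia >= 1.8)
Pkg.update("Flux")             # resolution then fails with the cuDNN/CUDA conflict shown above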

Maybe because that version of Flux pulls in cuDNN, whose available versions are incompatible with the CUDA 3.x range that AlphaZero restricts to?

I don't know how to move forward.
