nose.proxy.UnsupportedOperation: fileno #73

Open
haihai-00 opened this issue Aug 22, 2017 · 1 comment

Comments

@haihai-00

haihai-00 commented Aug 22, 2017

I ran nosetests under the test directory and got the following errors. I'm using Python 3.4.3, Theano 0.10.0b1, DIRECT 1.0.1, and cma 2.2.0:

..............DIRECT Version 2.0.4
Problem Dimension n : 3
Eps value : 0.1000E-03
Epsilon is constant.
Maximum number of f-evaluations (maxf) : 200
Maximum number of iterations (MaxT) : 200
Value of f_global : -0.1000+101
Global percentage wanted : 0.1000E-01
Volume percentage wanted : -0.1000E+01
Measure percentage wanted : -0.1000E+01
Jones original DIRECT algorithm is used.
Bounds on variable x 1 : 0.00000 <= xi <= 3.00000
Bounds on variable x 2 : 0.00000 <= xi <= 3.00000
Bounds on variable x 3 : 0.00000 <= xi <= 1.00000

Call-back cb_fcn_in_direct__user__routines failed.
EEE.DIRECT Version 2.0.4
Problem Dimension n : 3
Eps value : 0.1000E-03
Epsilon is constant.
Maximum number of f-evaluations (maxf) : 200
Maximum number of iterations (MaxT) : 200
Value of f_global : -0.1000+101
Global percentage wanted : 0.1000E-01
Volume percentage wanted : -0.1000E+01
Measure percentage wanted : -0.1000E+01
Jones original DIRECT algorithm is used.
Bounds on variable x 1 : 0.00000 <= xi <= 3.00000
Bounds on variable x 2 : 0.00000 <= xi <= 3.00000
Bounds on variable x 3 : 0.00000 <= xi <= 1.00000

Call-back cb_fcn_in_direct__user__routines failed.
E....E..EE....................EE......EE

ERROR: test_bayesian_optimization (test.test_fmin.test_fabolas.TestFminInterfaceFabolas)

Traceback (most recent call last):
File "/home/yujiezeng/RoBO-master/test/test_fmin/test_fabolas.py", line 32, in test_bayesian_optimization
num_iterations=3)
File "/home/yujiezeng/RoBO-master/robo/fmin/fabolas.py", line 255, in fabolas
new_x = maximizer.maximize()
File "/home/yujiezeng/RoBO-master/robo/maximizers/direct.py", line 58, in maximize
maxf=self.n_func_evals)
File "/usr/local/lib/python3.4/dist-packages/DIRECT/init.py", line 202, in solve
cdata
nose.proxy.ValueError: data type must provide an itemsize
-------------------- >> begin captured logging << --------------------
robo.fmin.fabolas: INFO: Initial Design
robo.fmin.fabolas: INFO: Evaluate [ 1.69439373 0.482941 ] on subset size 1000
robo.fmin.fabolas: INFO: Configuration achieved a performance of 0.269781 with cost 0.666667
robo.fmin.fabolas: INFO: Evaluation of this configuration took 0.000032 seconds
robo.fmin.fabolas: INFO: Evaluate [ 1.52588927 2.30983096] on subset size 500
robo.fmin.fabolas: INFO: Configuration achieved a performance of 0.386193 with cost 0.566323
robo.fmin.fabolas: INFO: Evaluation of this configuration took 0.000031 seconds
robo.fmin.fabolas: INFO: Start iteration 2 ...
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 0.02097522 0.59755141 -8.81338374 -3.43134297 -5.07902244 -9.87174703]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ -1.81730834 -0.89709109 -9.10406393 -19.87679305 2.31857963
-14.92482045]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 4.57202765 0.29965072 1.77546915 -3.03804691 -19.05497834
-16.63347632]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ -1.305978 -2.17223643 -3.61420603 -0.35647238 -10.10918047
-7.69056374]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 2.18240973 -2.38925303 1.34713723 1.4512109 1.56262757
-11.91145864]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ -1.15320771 -9.22795705 -0.88630784 -1.34085912 -6.383649
-10.20094539]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 2.88490038 -9.1242683 -6.35461619 -11.29592113 -1.32902471
-17.66490343]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 0.1464228 -7.23735333 1.66055707 -3.00008653 -5.77473881
-17.0431326 ]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 1.01592742 -0.29636148 -5.94322631 -14.02110886 0.16381314
-14.29859068]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 0.98807613 -0.79486195 -7.01943626 -5.09050946 0.56289086
-12.09571915]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ -0.47556749 -6.85544014 -6.50232686 -3.62445204 -5.76107346
-12.30878829]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 0.19899653 -6.51725148 -9.8666215 -10.85772118 0.55376277
-10.09597154]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ -1.3777787 -0.58384092 -9.33479475 -4.15056335 0.12519402
-10.60977827]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ -1.65784574 -4.88218857 -2.83135226 -5.02784295 -1.3545752
-19.46909648]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ -1.56998176 -1.10277724 -4.72971497 -0.69255535 0.71320786
-19.93896946]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 0.30471957 -0.71006325 -1.84545467 -13.69680313 -2.44366553
-16.95352385]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [-0.67952785 -2.73973835 -0.76871713 -2.06452097 -7.53712268 -8.35781702]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ -1.13187474 -5.98322734 -2.99208718 -13.03063135 0.79540068
-19.62443926]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [-1.38687854 -6.6496944 -1.31871034 -4.86179704 -2.70272752 -8.75946565]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ -1.613897 -0.50082838 -1.13929317 -10.68350368 -2.18560741
-18.67054974]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ -1.30575644 -6.78427331 -3.2517114 -2.15303744 -1.95694001
-11.02679194]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 4.45996225 -1.41815539 -4.81456899 -5.44968798 -5.11196495
-15.52584596]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 0.22694202 -8.57463617 -3.11717172 -0.11567946 1.00820722
-14.75140825]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ -1.43667884 -3.65472677 -7.22656167 -4.715566 -1.13186204
-11.0919997 ]
robo.fmin.fabolas: INFO: Current incumbent [ 1.69439373 0.482941 1. ] with estimated performance 0.289487
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_bayesian_optimization (test.test_fmin.test_fmin_interface.TestFminInterface)

Traceback (most recent call last):
File "/home/yujiezeng/RoBO-master/test/test_fmin/test_fmin_interface.py", line 31, in test_bayesian_optimization
num_iterations=3)
File "/home/yujiezeng/RoBO-master/robo/fmin/bayesian_optimization.py", line 133, in bayesian_optimization
x_best, f_min = bo.run(num_iterations)
File "/home/yujiezeng/RoBO-master/robo/solver/bayesian_optimization.py", line 167, in run
new_x = self.choose_next(self.X, self.y, do_optimize)
File "/home/yujiezeng/RoBO-master/robo/solver/bayesian_optimization.py", line 245, in choose_next
x = self.maximize_func.maximize()
File "/home/yujiezeng/RoBO-master/robo/maximizers/direct.py", line 60, in maximize
fileno = sys.stdout.fileno()
nose.proxy.UnsupportedOperation: fileno
-------------------- >> begin captured logging << --------------------
robo.solver.bayesian_optimization: INFO: Evaluate: [ 0.44798606]
robo.solver.bayesian_optimization: INFO: Configuration achieved a performance of 0.002705 in 0.000012 seconds
robo.solver.bayesian_optimization: INFO: Evaluate: [ 0.75128557]
robo.solver.bayesian_optimization: INFO: Configuration achieved a performance of 0.063144 in 0.000009 seconds
robo.solver.bayesian_optimization: INFO: Start iteration 2 ...
robo.solver.bayesian_optimization: INFO: Train model...
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 0.58708092 1.57967465 -16.1517907 ]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 0.18859492 0.89298495 -13.25663879]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 0.31369317 1.94651497 -13.96885373]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 0.4820195 0.41696565 -11.91200299]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 0.52865711 -1.45640437 -3.96483207]
robo.models.gaussian_process: DEBUG: GP Hyperparameters: [ 0.32426576 -1.62946687 -12.4438618 ]
robo.solver.bayesian_optimization: INFO: Time to train the model: 1.200303
robo.solver.bayesian_optimization: INFO: Maximize acquisition function...
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_bohamiann (test.test_fmin.test_fmin_interface.TestFminInterface)

Traceback (most recent call last):
File "/home/yujiezeng/RoBO-master/test/test_fmin/test_fmin_interface.py", line 41, in test_bohamiann
num_iterations=3)
File "/home/yujiezeng/RoBO-master/robo/fmin/bohamiann.py", line 79, in bohamiann
x_best, f_min = bo.run(num_iterations)
File "/home/yujiezeng/RoBO-master/robo/solver/bayesian_optimization.py", line 167, in run
new_x = self.choose_next(self.X, self.y, do_optimize)
File "/home/yujiezeng/RoBO-master/robo/solver/bayesian_optimization.py", line 245, in choose_next
x = self.maximize_func.maximize()
File "/home/yujiezeng/RoBO-master/robo/maximizers/direct.py", line 60, in maximize
fileno = sys.stdout.fileno()
nose.proxy.UnsupportedOperation: fileno

@aaronkl
Contributor

aaronkl commented Sep 8, 2017

Hi, it seems that this is the same problem as in #75. Could you pull the master branch again?
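
For reference, the underlying issue is that nose captures output by replacing sys.stdout with a proxy object that does not implement fileno(), so the call sys.stdout.fileno() in robo/maximizers/direct.py raises. Below is a minimal sketch of a guard, assuming a fallback to file descriptor 1 is acceptable; it is only an illustration (the helper name stdout_fileno is hypothetical), not necessarily the change that landed on master:

import io
import sys

def stdout_fileno():
    # Use the real fileno when available; fall back to file descriptor 1
    # when a test runner such as nose has swapped sys.stdout for a proxy
    # that does not implement fileno().
    try:
        return sys.stdout.fileno()
    except (AttributeError, io.UnsupportedOperation):
        return 1  # assumption: the process's real stdout is fd 1

fileno = stdout_fileno()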
