ResourceExhaustedError in train_c3d_ucf101.py #122

Open
Wei-i opened this issue Mar 4, 2020 · 2 comments

Wei-i commented Mar 4, 2020

I am sorry to disturb you, authors. I guess this error is probably caused by my limited GPU (NVIDIA GTX 960M with 2 GB GDDR5).
Which parts of the code can I change or optimize to make it run on such a weak machine? I am sincerely looking forward to your answer, thanks!


rocksyne commented Mar 4, 2020

Try this tool from the tensorflow library:

tf.data.Dataset.from_generator

It lets you use a generator function to load the data in batches instead of all at once. I am providing a pseudocode example below.

import sys

# the generator function
def data_generator(original_data: list = None, batch_size: int = 10):

	# make sure the input data is not None
	if original_data is None:
		sys.exit("Input data invalid!")

	# we need to run this forever
	while True:

		# run a for loop to fetch the data in batches
		for start in range(0, len(original_data), batch_size):
			data = original_data[start:start + batch_size]  # fetch one batch

			# I am not sure if this should be a numpy array,
			# but if it should, please use the appropriate type casting, e.g.
			# data = np.array(data).astype(np.float32)
			# Find your appropriate casting and set it as such

			# yield the current batch of data
			yield data


"""
H0w to use this in your code
-------------------------------------------------

import tensorflow as tf
from data_generator import data_generator

custom_genenerator = data_generator(parameter_1, optional_param_2)

dataset = tf.data.Dataset.from_generator(custom_genenerator,(tf.float32, tf.int16))
iterator = dataset.make_one_shot_iterator()

x,y = iterator.get_next()



Check the following references for explanation
--------------------------------------------------
Ref: 
https://sknadig.me/TensorFlow2.0-dataset/
https://www.tensorflow.org/api_docs/python/tf/data/Dataset#from_generator

"""


Wei-i commented Mar 5, 2020


Thanks for replying! But I am sorry, I have not made progress yet.
Is there any code where I can directly reduce the batch size?
