This repo contains the exercises for the course TensorFlow2.0-from-zero-to-advanced. Thanks to the tutor of this course.
The outline of the course and exercises is listed below:
- What is Tensorflow?
- Tensorflow version changes and tf1.0 architecture
- Tensorflow2.0 architecture
- Tensorflow vs PyTorch
- Tensorflow environment configuration
- Google_cloud without GPU environment
- Google_cloud_Remote jupyter_notebook configuration
- Google_cloud_gpu_tensorflow configuration
- Google_cloud_gpu_tensorflow mirror configuration
- AWS cloud platform environment configuration
- tf.keras brief introduction
- Classification, regression, and objective functions
- Data reading and display of actual classification model
- Model Construction of Classification Model
- Data normalization of classification model
- Callbacks
- Regression model
- Neural networks
- Deep neural network
- Batch normalization, activation function, dropout
- wide_deep model
- Function API to implement wide & deep model
- Subclass API to implement wide & deep model
- Multi-input and multi-output exercises of wide & deep model
- Hyperparameter search
- Manual implementation of hyperparameter search
- sklearn wrapper for Keras model
- sklearn hyperparameter search
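The manual hyperparameter search covered above can be sketched as a plain loop over candidate learning rates; the toy data, model size, and candidate values below are placeholder assumptions, not the course's own exercises:

```python
import numpy as np
import tensorflow as tf

# Toy regression data (placeholder for the course's dataset).
x = np.random.rand(200, 8).astype("float32")
y = x.sum(axis=1, keepdims=True)

def build_model(learning_rate):
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(8,)),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(loss="mse",
                  optimizer=tf.keras.optimizers.SGD(learning_rate))
    return model

# Train one model per candidate learning rate, keep the final loss of each.
results = {}
for lr in [1e-3, 1e-2, 1e-1]:
    model = build_model(lr)
    history = model.fit(x, y, epochs=3, verbose=0)
    results[lr] = history.history["loss"][-1]

best_lr = min(results, key=results.get)
```

The sklearn route replaces this loop with `RandomizedSearchCV` over a wrapped Keras model, which adds cross-validation for free.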
- tf.constant
- tf.strings and ragged tensor
- sparse tensor and tf.Variable
- Self-defined loss function and DenseLayer
- Use subclasses and lambdas to define layers separately
- tf.function function conversion
- @tf.function function conversion
- Function signature and graph structure
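The tf.function items above can be illustrated with a minimal sketch; the function itself is an arbitrary example:

```python
import tensorflow as tf

# @tf.function traces the Python function into a graph; later calls with
# matching input signatures reuse the traced graph instead of re-tracing.
@tf.function
def scaled_cube(x):
    return 3 * x ** 3

out = scaled_cube(tf.constant(2.0))  # 3 * 2^3 = 24.0

# A concrete function pins the signature and exposes the underlying graph.
concrete = scaled_cube.get_concrete_function(tf.TensorSpec([], tf.float32))
```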
- Approximate derivative
- tf.GradientTape basic usage
- tf.GradientTape used together with tf.keras
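A minimal tf.GradientTape sketch of the basic usage covered above:

```python
import tensorflow as tf

x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x ** 2          # y = x^2, so dy/dx = 2x
grad = tape.gradient(y, x)  # 2 * 3.0 = 6.0
```

The same tape drives custom training loops with tf.keras: compute the loss under the tape, then apply `tape.gradient(loss, model.trainable_variables)` with an optimizer.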
- tf.data basic API usage
- Generate csv file
- tf.io.decode_csv usage
- tf.data reads a csv file and uses it with tf.keras
- tfrecord basic API usage
- Generate tfrecords file
- tf.data reads tfrecord file and uses it with tf.keras
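The tf.data-to-tf.keras pattern above can be sketched with an in-memory dataset standing in for the CSV/TFRecord files; the arrays and model are toy placeholders:

```python
import numpy as np
import tensorflow as tf

# Placeholder arrays standing in for data parsed from csv/tfrecord files.
x = np.random.rand(100, 4).astype("float32")
y = np.random.rand(100, 1).astype("float32")

# Build a pipeline: shuffle, batch, repeat — then feed it to model.fit.
dataset = (tf.data.Dataset.from_tensor_slices((x, y))
           .shuffle(buffer_size=100)
           .batch(32)
           .repeat())

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(loss="mse", optimizer="sgd")
model.fit(dataset, steps_per_epoch=4, epochs=1, verbose=0)
```

For real files the only change is the source: `tf.data.TextLineDataset` plus `tf.io.decode_csv`, or `tf.data.TFRecordDataset` plus a parse function.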
- Titanic problem analysis
- feature_column usage
- keras_to_estimator
- Predefined estimator usage
- Cross feature exercises
- TF1.0 computational graph construction
- TF1.0 model training
- TF1_dataset usage
- TF1 self defined estimator
- Problems solved by convolution
- Calculation of convolution
- Pooling operation
- CNN exercises
- Depthwise separable convolutional network
- Depthwise separable convolutional network exercises
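A minimal tf.keras sketch of a depthwise separable convolutional network; the layer sizes are arbitrary assumptions:

```python
import tensorflow as tf

# SeparableConv2D factors a standard convolution into a per-channel
# depthwise pass plus a 1x1 pointwise pass, cutting parameters and FLOPs.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.SeparableConv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPool2D(2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
```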
- 10monkeys dataset
- Keras generator reads data
- 10monkeys basic model building and training
- 10monkeys model fine-tuning
- Keras generator reads cifar10 dataset
- Model training and prediction
- RNN and embedding
- Data set loading and construction of vocabulary index
- Data padding, model construction and training
- Sequential problems and recurrent neural networks
- Text Classification by RNN
- Data processing for text generation
- Model construction for text generation
- Sample text for text generation
- LSTM
- Text classification and text generation by LSTM
- Dataset loading and tokenizer for subword text classification
- Dataset transformation and model training for subword text classification
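A minimal text-classification model of the kind built in this section — embedding, LSTM, sigmoid output; the vocabulary size and dimensions below are assumptions:

```python
import tensorflow as tf

vocab_size = 10000   # assumed vocabulary size after tokenization
embed_dim = 16       # assumed embedding dimension

# Embedding maps token ids to dense vectors; the LSTM consumes the
# sequence; a sigmoid unit yields the binary class probability.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam")

# A batch of 2 padded sequences of length 100 (all-zero placeholder ids).
out = model(tf.zeros([2, 100], dtype=tf.int32))
```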
- GPU settings
- GPU default settings
- Memory growth and virtual device exercises
- GPU manual settings exercises
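The memory-growth setting above can be sketched as follows; on a CPU-only machine the device list is empty and the loop is a no-op:

```python
import tensorflow as tf

# Request on-demand GPU memory allocation instead of letting TensorFlow
# grab all GPU memory at start-up. Must run before any GPU is used.
gpus = tf.config.experimental.list_physical_devices("GPU")
for gpu in gpus:
    tf.config.experimental.set_memory_growth(gpu, True)
```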
- Distribution strategy
- Keras distribution exercises
- Estimator distribution exercises
- Self-defined process exercises
- Distributed self-defined process exercises
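A minimal sketch of the Keras distribution pattern: build and compile the model inside `strategy.scope()`, and tf.keras handles synchronous data-parallel training across the visible devices (falling back to a single device when no GPUs are present):

```python
import tensorflow as tf

# MirroredStrategy replicates variables across all visible GPUs and
# aggregates gradients; with no GPUs it runs on a single device.
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(4,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(loss="mse", optimizer="sgd")
```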
TFLite
- Saving model structure plus parameters and saving parameters only in practice
- Keras model convert to SavedModel
- Signature function convert to SavedModel
- Signature function, SavedModel and Keras model convert to concrete function
- tflite saving, interpretation and quantization
- tensorflowjs converts model
- tensorflowjs builds server and loads model
- Android model deployment
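The Keras-to-TFLite conversion with quantization can be sketched as follows; the model here is a toy placeholder:

```python
import tensorflow as tf

# Any trained Keras model works here; this one-layer model is a stand-in.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])

# Convert directly from the Keras model to a TFLite flatbuffer;
# Optimize.DEFAULT enables post-training quantization.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_bytes = converter.convert()
```

The resulting bytes are written to a `.tflite` file and loaded on-device with `tf.lite.Interpreter`.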
seq2seq+attention model
- Data preprocessing and reading
- Convert string to id and dataset generation
- Build Encoder
- Build attention
- Build decoder
- Loss function and one step training
- Model training
- Model prediction
Transformer
- Encoder-Decoder with scaled dot-product attention
- Multi head attention with position encoding
- Data preprocessing and dataset generation
- Position encoding
- Build mask
- Build scaled dot-product attention
- Build Multi head attention
- Feedforward layer
- Encoder layer
- Decoder layer
- Encoder model
- Decoder model
- Transformer
- Self define learning rate
- Model training and evaluation
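The scaled dot-product attention built in this section computes softmax(QKᵀ/√d_k)·V; a minimal sketch with random inputs:

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask=None):
    """softmax(Q·Kᵀ / sqrt(d_k)) · V, with an optional additive mask."""
    matmul_qk = tf.matmul(q, k, transpose_b=True)      # (..., seq_q, seq_k)
    dk = tf.cast(tf.shape(k)[-1], tf.float32)
    scaled = matmul_qk / tf.math.sqrt(dk)
    if mask is not None:
        scaled += mask * -1e9  # masked positions get ~zero attention weight
    weights = tf.nn.softmax(scaled, axis=-1)
    return tf.matmul(weights, v), weights

# Self-attention over a toy sequence: batch 1, length 4, depth 8.
q = tf.random.normal([1, 4, 8])
out, w = scaled_dot_product_attention(q, q, q)
```

Multi-head attention splits the depth dimension into heads, runs this function per head, and concatenates the results.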