Computer Vision and Deep Learning
A preliminary introduction to computer vision and deep learning. A linear classifier is a linear mapping from input image features to category scores. The aim is to give a general overview of the topic.
Computer Vision and Deep Learning
Image classification task
Classifier
nearest neighbor classifier
Bayesian classifier
linear classifier
Support vector machine classifier
neural network classifier
random forest
AdaBoost
loss function
0-1 loss
Multi-class support vector machine loss
cross entropy loss
L1 loss
L2 loss
optimization
first order method
gradient descent
stochastic gradient descent
mini-batch stochastic gradient descent
second order method
Newton's method
BFGS
L-BFGS
training process
Data set partitioning
Data preprocessing
data augmentation
Underfitting and overfitting
Reduce computational complexity
Use weight regularization terms
Use dropout regularization (see the sketch after this outline)
Hyperparameter tuning
Model ensembling
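One way to make the dropout item above concrete is a small NumPy sketch of inverted dropout. The array shapes, the keep probability `p`, and the function name are illustrative assumptions, not taken from the outline.

```python
import numpy as np

def dropout_forward(x, p=0.5, train=True):
    """Inverted dropout: randomly zero activations during training.

    x     -- activation array (assumed shape: batch x hidden units)
    p     -- probability of KEEPING a unit (assumed hyperparameter)
    train -- dropout is applied only at training time
    """
    if not train:
        return x                                  # at test time the layer is a no-op
    mask = (np.random.rand(*x.shape) < p) / p     # rescale so the expected activation is unchanged
    return x * mask

# toy usage: a batch of 4 samples with 6 hidden activations each
h = np.random.randn(4, 6)
h_train = dropout_forward(h, p=0.5, train=True)   # roughly half the units are zeroed
h_test = dropout_forward(h, train=False)          # unchanged at test time
```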
Linear classifier
definition
A linear mapping from input image features to per-class scores
weight vector
After training, the visualized weight vector resembles the average image of its category
Each weight vector can be viewed as a template: the larger the dot product between the input image and the template, the more similar they are and the higher the class score
decision boundary
The hyperplane where the class score equals 0 is the decision boundary
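A minimal sketch of the linear score mapping described in this branch, assuming a flattened feature vector `x`, a weight matrix `W` whose rows act as class templates, and a bias `b` (all names and shapes are illustrative).

```python
import numpy as np

def linear_scores(x, W, b):
    """Map a flattened image feature vector to per-class scores: f(x) = Wx + b.

    W -- (num_classes, num_features); each row is a class template, so a larger
         dot product with x means the image is more similar to that class
    b -- (num_classes,) bias vector
    """
    return W @ x + b

# toy example: 3 classes, 4-dimensional image features
W = np.random.randn(3, 4)
b = np.zeros(3)
x = np.random.randn(4)
scores = linear_scores(x, W, b)
predicted_class = int(np.argmax(scores))   # pick the class with the highest score
```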
Loss function
A bridge between model performance and model parameters that guides parameter optimization
A function that measures how far the predictions of a given classifier deviate from the true labels; its output is usually a non-negative real value
This non-negative output serves as a feedback signal for adjusting the classifier parameters, reducing the loss on the current examples and improving classification performance
Multi-class support vector machine loss (hinge loss; see the sketch below)
Regularization term
Prevents the model from overfitting the training set
A regularization-strength hyperparameter controls how much the term contributes to the total loss
Its main purpose is to encourage spread-out weights, so the classifier uses all feature dimensions instead of relying heavily on a few of them
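A minimal NumPy sketch of the multi-class SVM (hinge) loss with an L2 regularization term, as described above; the margin `delta` and regularization strength `lam` are assumed hyperparameter values.

```python
import numpy as np

def multiclass_svm_loss(W, X, y, delta=1.0, lam=1e-3):
    """Average multi-class SVM (hinge) loss over a batch, plus L2 regularization.

    W -- (num_classes, num_features) weight matrix
    X -- (num_samples, num_features) feature matrix
    y -- (num_samples,) integer class labels
    """
    scores = X @ W.T                                   # (num_samples, num_classes)
    correct = scores[np.arange(len(y)), y][:, None]    # score of the true class
    margins = np.maximum(0.0, scores - correct + delta)
    margins[np.arange(len(y)), y] = 0.0                # the true class contributes no loss
    data_loss = margins.sum(axis=1).mean()
    reg_loss = lam * np.sum(W * W)                     # L2 term: encourages spread-out weights
    return data_loss + reg_loss
```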
Parameter optimization
gradient descent
numerical gradient
Used to check the correctness of the analytical gradient (see the gradient-check sketch below)
Analytical gradient
Newton-Leibniz formula
Commonly used in practice
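A minimal sketch of using the numerical gradient to check an analytical gradient, as the branch above suggests; the test function `f` (a simple quadratic) is an illustrative assumption.

```python
import numpy as np

def numerical_gradient(f, w, h=1e-5):
    """Centered-difference approximation of the gradient of f at w."""
    grad = np.zeros_like(w)
    for i in range(w.size):
        step = np.zeros_like(w)
        step.flat[i] = h
        grad.flat[i] = (f(w + step) - f(w - step)) / (2.0 * h)
    return grad

# toy check: f(w) = sum(w**2) has the analytical gradient 2*w
f = lambda w: np.sum(w ** 2)
w = np.random.randn(5)
num_grad = numerical_gradient(f, w)
ana_grad = 2.0 * w
rel_error = np.max(np.abs(num_grad - ana_grad) / (np.abs(num_grad) + np.abs(ana_grad) + 1e-12))
print("relative error:", rel_error)   # should be very small (around 1e-10)
```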
stochastic gradient descent
Mini-batch stochastic gradient descent algorithm
iteration
The model parameters are updated once in every iteration
batch size
The number of samples used in one iteration
Epoch (number of rounds)
One epoch means that every sample in the training set has been passed through once
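A minimal sketch of a mini-batch SGD training loop that ties together iteration, batch size, and epoch as defined above. The toy least-squares objective, the learning rate, and the batch size are illustrative assumptions standing in for an image classifier's loss.

```python
import numpy as np

# toy data: linear regression as a stand-in for the image classifier
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))       # 1000 training samples, 10 features
true_w = rng.normal(size=10)
y = X @ true_w + 0.1 * rng.normal(size=1000)

w = np.zeros(10)
learning_rate = 0.05                  # hyperparameter (assumed value)
batch_size = 32                       # number of samples used in one iteration
num_epochs = 5                        # one epoch = one full pass over the training set

for epoch in range(num_epochs):
    order = rng.permutation(len(X))   # reshuffle the training set each epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        grad = 2.0 * Xb.T @ (Xb @ w - yb) / len(idx)   # gradient of the mean squared error
        w -= learning_rate * grad     # one parameter update per iteration
    loss = np.mean((X @ w - y) ** 2)
    print(f"epoch {epoch + 1}: loss {loss:.4f}")
```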
Data set partitioning
Classic division
Training set
Used to learn the classifier parameters for a given choice of hyperparameters
Validation set
for selecting hyperparameters
Test set
Assess generalization ability
k-fold cross validation
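A minimal sketch of the classic training/validation/test split and of k-fold cross validation for hyperparameter selection; the split ratios, the value of k, and the function names are assumptions for illustration.

```python
import numpy as np

def train_val_test_split(n, train=0.6, val=0.2, seed=0):
    """Return index arrays for a classic train/validation/test partition."""
    idx = np.random.default_rng(seed).permutation(n)
    n_train = int(train * n)
    n_val = int(val * n)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

def k_fold_indices(n, k=5, seed=0):
    """Yield (train_idx, val_idx) pairs for k-fold cross validation."""
    idx = np.random.default_rng(seed).permutation(n)
    folds = np.array_split(idx, k)
    for i in range(k):
        val_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train_idx, val_idx

# usage: choose the hyperparameter setting with the best average validation score
train_idx, val_idx, test_idx = train_val_test_split(1000)
for train_i, val_i in k_fold_indices(1000, k=5):
    pass   # fit on train_i, evaluate the hyperparameter setting on val_i
```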
Data preprocessing
Remove the mean
Ensures the data are not affected by the overall offset of their value range
Normalization
Ensures the data are not affected by differing feature scales (units)
Decorrelation
Removes correlation between features and can also reduce the dimensionality to some extent
After decorrelation, the covariance matrix of the data is a diagonal matrix
Whitening
Normalization of the decorrelated data (unit variance in every direction)
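A minimal sketch of the preprocessing steps above: mean removal, per-feature normalization, PCA-based decorrelation, and whitening. The data matrix layout (samples by features) and the stability constant `eps` are assumptions.

```python
import numpy as np

def preprocess(X, eps=1e-8):
    """Mean removal, normalization, PCA decorrelation, and whitening.

    X -- (num_samples, num_features) data matrix
    Returns the normalized, decorrelated, and whitened versions of X.
    """
    X = X - X.mean(axis=0)                       # 1. remove the mean
    X_norm = X / (X.std(axis=0) + eps)           # 2. normalize per-feature scale

    cov = X.T @ X / X.shape[0]                   # covariance of the zero-mean data
    eigvals, eigvecs = np.linalg.eigh(cov)       # eigendecomposition of the covariance
    X_decorr = X @ eigvecs                       # 3. decorrelate: covariance becomes diagonal
    X_white = X_decorr / np.sqrt(eigvals + eps)  # 4. whiten: unit variance in every direction
    return X_norm, X_decorr, X_white

# usage with correlated toy data
X = np.random.randn(200, 5) @ np.random.randn(5, 5)
X_norm, X_decorr, X_white = preprocess(X)
# np.cov(X_decorr, rowvar=False) is approximately diagonal
# np.cov(X_white, rowvar=False) is approximately the identity matrix
```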