ReLU
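ReLU (Rectified Linear Unit) is an activation function defined as ReLU(x) = max(0, x): positive inputs pass through unchanged, and negative inputs are clamped to zero. Applied element-wise after a layer's linear transform, it supplies the non-linearity a neural network needs to model anything beyond a linear map. A minimal sketch in Python with NumPy (the function names here are illustrative, not taken from the posts linked below):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Element-wise ReLU: max(0, x)."""
    return np.maximum(0.0, x)

def relu_grad(x: np.ndarray) -> np.ndarray:
    """Derivative of ReLU used in backpropagation: 1 where x > 0, else 0."""
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

Compared to sigmoid or tanh, ReLU is cheap to compute and does not saturate for positive inputs, which helps gradients flow during training; its main drawback is that units stuck on the negative side receive zero gradient.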
See also: Activation Functions in Neural Networks · ReLU Function in Deep Learning
Referenced in: Classifying MNIST Handwritten Digits With a Fully Connected Neural Network