Backpropagation
[[wikipedia:Backpropagation]] is an [[ML]] algorithm that computes the gradient of a loss function with respect to the weights of a feedforward neural network, by applying the [[Chain rule|chain rule]] backward through the network's layers.

* [[Yann LeCun]]
* [[Chain rule]]
* [[N400]]
* [[Optical computing]]

== See also ==
* {{ML}}
* {{NN}}
Latest revision as of 05:53, 17 January 2024
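The gradient computation described above can be sketched in a few lines. This is a minimal illustrative example, not from the article: a one-hidden-unit network with a tanh activation and squared-error loss, where every name (`forward`, `backprop`, `w1`, `w2`) is an assumption made here for demonstration.

```python
import math

def forward(x, w1, w2):
    """Forward pass through a tiny 1-hidden-unit feedforward network."""
    h = math.tanh(w1 * x)   # hidden activation
    y = w2 * h              # linear output
    return h, y

def loss(y, t):
    """Squared-error loss against target t."""
    return 0.5 * (y - t) ** 2

def backprop(x, t, w1, w2):
    """Return dL/dw1 and dL/dw2 via the chain rule (backpropagation)."""
    h, y = forward(x, w1, w2)           # forward pass, caching intermediates
    dL_dy = y - t                       # dL/dy for squared error
    dL_dw2 = dL_dy * h                  # dL/dw2 = dL/dy * dy/dw2
    dL_dh = dL_dy * w2                  # dL/dh  = dL/dy * dy/dh
    dL_dw1 = dL_dh * (1 - h * h) * x    # tanh'(z) = 1 - tanh(z)^2
    return dL_dw1, dL_dw2
```

A common sanity check is to compare these analytic gradients against central finite differences of the loss; the two should agree to several decimal places.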