Backpropagation
[[wikipedia:Backpropagation|Backpropagation]] is an [[ML]] algorithm that computes the gradient of a loss function with respect to the weights of a feedforward neural network, i.e. the gradient in weight space.
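A minimal sketch of the idea, not taken from this page: a tiny two-layer network with a mean-squared-error loss, where the backward pass applies the chain rule layer by layer to obtain the gradient in weight space. All names, shapes, and the toy data below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 input features, 1 target value each (assumed for illustration).
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Weights of a 3 -> 5 -> 1 feedforward network (biases omitted for brevity).
W1 = rng.normal(size=(3, 5))
W2 = rng.normal(size=(5, 1))

# Forward pass.
z1 = X @ W1           # hidden-layer pre-activation
h = np.tanh(z1)       # hidden activations
y_hat = h @ W2        # network output
loss = np.mean((y_hat - y) ** 2)

# Backward pass: chain rule from the loss back to each weight matrix,
# yielding the gradient of the loss in weight space.
d_yhat = 2 * (y_hat - y) / y.size        # dL/dy_hat
dW2 = h.T @ d_yhat                       # dL/dW2
d_h = d_yhat @ W2.T                      # dL/dh
d_z1 = d_h * (1 - np.tanh(z1) ** 2)      # dL/dz1 (tanh derivative)
dW1 = X.T @ d_z1                         # dL/dW1

# One gradient-descent step using the computed gradients.
lr = 0.1
W1 -= lr * dW1
W2 -= lr * dW2

Frameworks listed below (PyTorch, TensorFlow, JAX, Micrograd, etc.) automate this backward pass via automatic differentiation rather than hand-written gradients.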
- Machine learning, Deep learning, AWS Sagemaker, PyTorch, Kubeflow, TensorFlow, Keras, Torch, Spark ML, Tinygrad, Apple Neural Engine, Scikit-learn, MNIST, MLOps, AutoML, ClearML, PostgresML, AWS Batch, Transformer, Diffusion, Backpropagation, JAX, Vector database, LLM, The Forrester Wave: AI/ML Platforms
- Artificial neural networks, Neuronal network (NN), CNN, Micrograd, NPU, ConvNet, AlexNet, GoogLeNet, Apache MXNet, Neural architecture search, DAG, Feedforward neural network, NeurIPS, Feature Pyramid Network, TPU, Apple Neural Engine (ANE), LLM, TFLOPS