Artificial neural networks
wikipedia:Artificial neural networks
- TensorFlow library
- Typically, neurons are aggregated into layers.
- Backpropagation
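A minimal sketch of the two ideas above: neurons aggregated into layers, trained by backpropagation. The XOR task, layer sizes, activation, and learning rate are illustrative assumptions, not taken from this page.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR data (an assumed example task)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Neurons aggregated into two layers: 2 inputs -> 4 hidden -> 1 output
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)          # hidden layer activations
    return h, sigmoid(h @ W2 + b2)    # network output

lr = 1.0  # learning rate: a hyperparameter, chosen rather than learned
_, out = forward(X)
initial_loss = float(np.mean((out - y) ** 2))

for _ in range(5000):
    h, out = forward(X)
    # Backpropagation: push the error gradient back layer by layer
    d_out = (out - y) * out * (1 - out)   # gradient at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # gradient at the hidden layer
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

_, out = forward(X)
final_loss = float(np.mean((out - y) ** 2))
```

The same loop is what libraries such as TensorFlow automate: they build the forward pass as a graph and derive the backward pass for you.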
Design
Design issues include deciding the number, type, and connectedness of the network's layers, as well as the size of each layer and the connection type (full, pooling, ...).
Hyperparameters must also be defined as part of the design (they are not learned); they govern matters such as the number of neurons in each layer, the learning rate, step size, stride, depth, receptive field, and padding (for CNNs), etc.
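The CNN hyperparameters listed above interact through the standard output-size formula. A small sketch (the function name and the 32×32 / 5×5 figures are illustrative assumptions):

```python
def conv_output_size(n_in, kernel, stride=1, padding=0):
    """Spatial output size of a convolution along one dimension.

    Standard formula: floor((n_in + 2*padding - kernel) / stride) + 1,
    where kernel is the receptive field size.
    """
    return (n_in + 2 * padding - kernel) // stride + 1

# A 32-wide input with a 5-wide receptive field, stride 1, padding 2
# keeps its size ("same" padding); with no padding it shrinks to 28.
same = conv_output_size(32, 5, stride=1, padding=2)  # 32
valid = conv_output_size(32, 5, stride=1, padding=0)  # 28
```

Stride and padding must be chosen so the formula yields a whole number at every layer; depth (number of filters) is independent of the spatial size.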
Related
See also
- AI: Autonomous driving, OpenAI, Google AI, Eliezer Yudkowsky, DeepMind, Computer Vision, Neural network, Vertex AI, Instadeep, Deep learning, Infogrid, Sapling, AssemblyAI, V7, MTIA, Yann LeCun, AI WiW, Salesforce AI, Pika, Amazon Q, LLM, Ollama, Cloud AI Developer Services, Hugging Face, Databricks
- Machine learning, Deep learning, AWS Sagemaker, PyTorch, Kubeflow, TensorFlow, Keras, Torch, Spark ML, Tinygrad, Apple Neural Engine, Scikit-learn, MNIST, MLOps, AutoML, ClearML, PostgresML, AWS Batch, Transformer, Diffusion, Backpropagation, JAX, Vector database, LLM, The Forrester Wave: AI/ML Platforms
- Artificial neural networks, Neuronal network (NN), CNN, Micrograd, NPU, ConvNet, AlexNet, GoogLeNet, Apache MXNet, Neural architecture search, DAG, Feedforward neural network, NeurIPS, Feature Pyramid Network, TPU, Apple Neural Engine (ANE), LLM, TFLOPS