Difference between revisions of "Neural Processing Unit (NPU)"
Revision as of 10:05, 6 March 2024
wikipedia:Neural Processing Unit (AI accelerator)
- FSD Chip
- M1 (Nov 2020) 16-core Neural Engine, 11 trillion operations per second
- M2 (Jun 2022) 16-core Neural Engine, 15.8 trillion operations per second
- M3 (Oct 2023) 16-core Neural Engine, 18 trillion operations per second
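The throughput figures above are peak rates in trillions of operations per second (TOPS). A minimal back-of-the-envelope sketch of what those rates mean in practice, assuming Apple's published peak figures of 11, 15.8, and 18 TOPS for the M1/M2/M3 Neural Engines (ideal peak math only, not a benchmark):

```python
# Ideal time to execute a fixed operation budget on each Neural Engine
# generation, using the peak TOPS figures listed above. Peak figures are
# theoretical maxima; real workloads achieve a fraction of them.

PEAK_TOPS = {
    "M1": 11.0,   # Nov 2020
    "M2": 15.8,   # Jun 2022
    "M3": 18.0,   # Oct 2023
}

def seconds_for_ops(total_ops: float, tops: float) -> float:
    """Ideal execution time for total_ops operations at a peak rate of
    `tops` trillion operations per second."""
    return total_ops / (tops * 1e12)

# Example: a model forward pass needing roughly one trillion operations.
for chip, tops in PEAK_TOPS.items():
    print(f"{chip}: {seconds_for_ops(1e12, tops) * 1000:.1f} ms")
```

At these peak rates, a one-trillion-operation workload takes on the order of 50 to 90 milliseconds per pass, which is why such accelerators are sized in TOPS rather than GFLOPS.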
Related
See also
- Artificial neural networks, Neural network (NN), CNN, Micrograd, ConvNet, AlexNet, GoogLeNet, Apache MXNet, Neural architecture search, DAG, Feedforward neural network, NeurIPS, Feature Pyramid Network, TPU, NPU, Apple Neural Engine (ANE), LLM, TFLOPS
- CPU, GPU, NPU, TPU, DPU, Groq, ProLiant, thread (Pthreads), processor, CPU socket, core, ARM, CPU Virtualization, Intel, AMD, nm, lscpu, AVX-512, Passthrough, CPU intensive, Graviton processor, Branch predictor, vCPU, SSE, Power