Learning Deep Learning: From Perceptron to Large Language Models (Video Course)
English | MP4 | AVC 1280×720 | AAC 44 kHz 2ch | 110 Lessons (13h 23m) | 2.75 GB
Learning Deep Learning: From Perceptron to Large Language Models (Video Course): A complete guide to deep learning for artificial intelligence
Deep learning (DL) is a key component of today’s exciting advances in machine learning and artificial intelligence. Illuminating both the core concepts and the hands-on programming techniques needed to succeed, this video course is ideal for developers, data scientists, analysts, and others, including those with no prior machine learning or statistics experience.
Learn How To:
- Apply core concepts of perceptrons, gradient-based learning, sigmoid neurons, and backpropagation
- Utilize DL frameworks to make it easier to develop more complicated and useful neural networks
- Utilize convolutional neural networks (CNNs) to perform image classification and analysis
- Apply recurrent neural networks (RNNs) and long short-term memory (LSTM) to text and other variable-length sequences
- Build a natural language translation application using sequence-to-sequence networks based on the transformer architecture
- Use the transformer architecture for other natural language processing (NLP) tasks, and engineer prompts for large language models (LLMs)
- Combine image and text data and build multimodal networks, including an image captioning application
After introducing the essential building blocks of deep neural networks, such as artificial neurons and fully connected, convolutional, and recurrent layers, Magnus Ekman shows how to use them to build advanced architectures, including the Transformer. He describes how these concepts are used to build modern networks for computer vision and natural language processing (NLP), including large language models and multimodal networks.