
Topic author: Deep Learning for NLP - Part 1  (Read 141 times)

0 Members and 1 Guest are viewing this topic.

Online mitsumi

  • Sub-Administrator
  • ****
  • Posts: 129146
  • Karma: +0/-0
Deep Learning for NLP - Part 1
« on: 13 August 2021, 14:27 »
MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz
Language: English | Size: 1.14 GB | Duration: 3h 16m

What you'll learn
Deep Learning for Natural Language Processing
Multi-Layered Perceptrons (MLPs)
Word embeddings
Recurrent Models: RNNs, LSTMs, GRUs and variants
DL for NLP
Requirements
Basics of machine learning
Description
This course is part of the "Deep Learning for NLP" series. In this course, I will introduce basic deep learning concepts such as multi-layered perceptrons, word embeddings and recurrent neural networks. These concepts form the basis for a good understanding of advanced deep learning models for Natural Language Processing.

The course consists of three sections.

In the first section, I will talk about basic concepts in artificial neural networks, such as activation functions (ramp, step, sigmoid, tanh, ReLU, leaky ReLU), integration functions, the perceptron and the back-propagation algorithm. I also discuss what deep learning is and how it relates to machine learning and artificial intelligence. Finally, I will talk about how to handle overfitting in neural network training using methods like regularization, early stopping and dropout.
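To make these ideas concrete, here is a minimal sketch (not taken from the course materials; the data, layer sizes and hyperparameters are illustrative) of an MLP in Keras that uses the techniques named above: ReLU and sigmoid activations, L2 weight regularization, dropout and early stopping.

Code:
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers, regularizers

# Toy data standing in for a real dataset: 1000 samples, 20 features, binary labels.
X = np.random.rand(1000, 20).astype("float32")
y = np.random.randint(0, 2, size=(1000,)).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(20,)),
    layers.Dense(64, activation="relu",
                 kernel_regularizer=regularizers.l2(1e-4)),  # weight regularization
    layers.Dropout(0.3),                                     # dropout against overfitting
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),                   # sigmoid output for a binary task
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Early stopping halts training once the validation loss stops improving.
early_stop = keras.callbacks.EarlyStopping(monitor="val_loss", patience=3,
                                           restore_best_weights=True)
model.fit(X, y, validation_split=0.2, epochs=50, batch_size=32,
          callbacks=[early_stop], verbose=0)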

In the second section, I will talk about various word embedding methods. I will start with basic methods like one-hot encoding and Singular Value Decomposition (SVD). Next, I will cover the popular word2vec model, including both the CBOW and skip-gram variants. Further, I will discuss multiple methods to make the softmax computation efficient. This will be followed by a discussion of GloVe. As a special word embedding topic, I will cover cross-lingual embeddings. Finally, I will also talk about sub-word embeddings like BPE (Byte Pair Encoding), WordPiece and SentencePiece, which are popularly used for Transformer-based models.
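As a quick illustration (a sketch of my own, assuming the gensim library is available; the tiny corpus is made up), the CBOW and skip-gram variants of word2vec discussed above can be trained like this:

Code:
from gensim.models import Word2Vec

# Tiny illustrative corpus of pre-tokenized sentences; a real corpus would be much larger.
sentences = [
    ["deep", "learning", "for", "natural", "language", "processing"],
    ["word", "embeddings", "map", "words", "to", "dense", "vectors"],
    ["skipgram", "predicts", "context", "words", "from", "a", "target", "word"],
]

# sg=0 -> CBOW (predict target from context); sg=1 -> skip-gram (predict context from target).
cbow = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0, epochs=50)
skipgram = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1,
                    negative=5, epochs=50)  # negative sampling approximates the full softmax

print(skipgram.wv["words"].shape)           # (50,) dense vector for a vocabulary word
print(skipgram.wv.most_similar("words"))    # nearest neighbours in embedding space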

In the third section, I will start with a general discussion of n-gram models. Next, I will briefly introduce the neural network language model (NNLM). Then we will spend quite some time understanding how RNNs work, and we will also cover RNN variants like BiRNNs and deep BiRNNs. After that, I will discuss the vanishing and exploding gradient problems, followed by details of the LSTM and GRU architectures.
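For reference, here is a minimal sketch (framework choice, sizes and the sentiment-style task are my own assumptions, not the course's) of the recurrent architectures mentioned above: an embedding layer feeding a bidirectional LSTM, with a GRU shown as a drop-in alternative. The gating in LSTMs and GRUs is what mitigates the vanishing-gradient problem of plain RNNs.

Code:
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, seq_len, embed_dim = 5000, 40, 64   # illustrative sizes

model = keras.Sequential([
    keras.Input(shape=(seq_len,)),
    layers.Embedding(vocab_size, embed_dim),      # learned word embeddings
    layers.Bidirectional(layers.LSTM(128)),       # BiLSTM reads the sequence in both directions
    # layers.GRU(128),                            # GRU variant: fewer gates, similar idea
    layers.Dense(1, activation="sigmoid"),        # e.g. a binary sentiment head
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Toy integer-encoded sequences standing in for tokenized text.
X = np.random.randint(0, vocab_size, size=(256, seq_len))
y = np.random.randint(0, 2, size=(256,)).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)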

Who this course is for:
Beginners in deep learning
Python developers interested in data science concepts

Download link:
Only visible to registered members who have replied to the topic.

Links are Interchangeable - No Password - Single Extraction