Topic author: NLP: Natural Language Processing ML Model Deployment at AWS (Updated 9/2020)  (Read 208 times)

0 Members and 1 Guest are viewing this topic.

Offline mitsumi

  • Sub-Administrator
  • Posts: 129146
  • Karma: +0/-0

NLP: Natural Language Processing ML Model Deployment at AWS
Duration: 9h37m | MP4, 1280x720, 30 fps | AAC, 44100 Hz, 2ch | 4.04 GB
Genre: eLearning | Language: English
Build & Deploy BERT, DistilBERT, FastText NLP Models in Production with Flask, uWSGI, and NGINX at AWS EC2.

What you'll learn
Complete End to End NLP Application
How to work with BERT in Google Colab
How to use BERT for Text Classification
Deploy Production Ready ML Model
Fine Tune and Deploy ML Model with Flask
Deploy ML Model in Production at AWS
Deploy ML Model at Ubuntu and Windows Server
DistilBERT vs BERT
Optimize your NLP Code
How to develop and deploy a FastText model on AWS (see the sketch after this list)
Multi-Label and Multi-Class classification in NLP
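
As an illustration of the FastText item above, here is a minimal sketch, assuming the fasttext Python package and labelled train.txt/test.txt files in fastText's "__label__X text" format (file names and hyperparameters are placeholders, not course materials):

# Hypothetical sketch of supervised FastText text classification.
import fasttext

# Train a supervised classifier; hyperparameters here are illustrative only.
model = fasttext.train_supervised(input="train.txt", epoch=10, lr=0.5, wordNgrams=2)

# Evaluate: test() returns (number of samples, precision@1, recall@1).
n, precision, recall = model.test("test.txt")
print(f"samples={n} precision@1={precision:.3f} recall@1={recall:.3f}")

# Predict the top label for a new sentence.
labels, probs = model.predict("this course explains model deployment on aws")
print(labels[0], probs[0])

# Save the trained model so it can later be copied to an EC2 instance.
model.save_model("fasttext_model.bin")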

Requirements
Introductory knowledge of NLP
Comfortable in Python, Keras, and TensorFlow 2
Basic Elementary Mathematics

Description
Are you ready to kickstart your advanced NLP course? Are you ready to deploy your machine learning models in production at AWS? You will learn each and every step of building and deploying your ML model on a robust and secure server at AWS.

Prior knowledge of Python and Data Science is assumed. If you are an absolute beginner in Data Science, please do not take this course. This course is made for intermediate and advanced Data Scientists.

What is BERT?

BERT is a method of pre-training language representations, meaning that we train a general-purpose "language understanding" model on a large text corpus (like Wikipedia), and then use that model for downstream NLP tasks that we care about (like question answering). BERT outperforms previous methods because it is the first unsupervised, deeply bidirectional system for pre-training NLP.

Unsupervised means that BERT was trained using only a plain text corpus, which is important because an enormous amount of plain text data is publicly available on the web in many languages.

Why is BERT so revolutionary?

Not only has it been pre-trained on one of the largest text corpora used for such models, it is also remarkably easy to adapt to different NLP applications by adding additional output layers. This allows users to create sophisticated and precise models to carry out a wide variety of NLP tasks.
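
For instance, attaching a classification head to a pre-trained BERT checkpoint can look like this minimal sketch; it assumes the Hugging Face transformers library on TensorFlow 2 / Keras and toy in-memory data, and is not the course's own code:

# Hypothetical sketch: fine-tuning BERT for binary text classification.
import tensorflow as tf
from transformers import BertTokenizer, TFBertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = TFBertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

texts = ["great course, very practical", "waste of time"]
labels = tf.constant([1, 0])

# Tokenize into the input_ids / attention_mask tensors BERT expects.
enc = tokenizer(texts, padding=True, truncation=True, max_length=64, return_tensors="tf")

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(dict(enc), labels, epochs=2, batch_size=2)

# Save for later deployment (directory name is a placeholder).
model.save_pretrained("bert_sentiment")
tokenizer.save_pretrained("bert_sentiment")

The same pattern applies to DistilBERT (TFDistilBertForSequenceClassification), which keeps most of BERT's accuracy while being smaller and faster at inference.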

Here is what you will learn in this course
Notebook Setup and What is BERT.
Data Preprocessing.
BERT Model Building and Training.
BERT Model Evaluation and Saving.
DistilBERT Model Fine Tuning and Deployment
Deploy Your ML Model at AWS with a Flask Server (see the sketch after this list)
Deploy Your Model at Both Windows and Ubuntu Machine
And so much more!
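
A minimal sketch of the serving side, assuming a model saved as in the earlier sketch plus the flask and transformers packages; in production this app would sit behind uWSGI and NGINX on the EC2 instance (route, port, and paths are illustrative):

# Hypothetical Flask inference service for the fine-tuned classifier.
import tensorflow as tf
from flask import Flask, request, jsonify
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

app = Flask(__name__)

# Load the saved tokenizer and model once at startup, not per request.
tokenizer = AutoTokenizer.from_pretrained("bert_sentiment")
model = TFAutoModelForSequenceClassification.from_pretrained("bert_sentiment")

@app.route("/predict", methods=["POST"])
def predict():
    text = request.get_json(force=True).get("text", "")
    enc = tokenizer(text, truncation=True, max_length=64, return_tensors="tf")
    probs = tf.nn.softmax(model(enc).logits, axis=-1).numpy()[0]
    return jsonify({"label": int(probs.argmax()), "probabilities": probs.tolist()})

if __name__ == "__main__":
    # Development server only; uWSGI + NGINX would front this in production.
    app.run(host="0.0.0.0", port=5000)

A JSON body such as {"text": "great course"} POSTed to /predict would then return the predicted label and class probabilities.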

All of this is done on Google Colab, which means it doesn't matter what processor or computer you have. It is super easy to use, and a plus point is that you get a free GPU to use in your notebook.

Who this course is for:
AI students eager to learn advanced techniques of text processing
Data Science enthusiasts who want to build an end-to-end NLP application
Anyone who wants to strengthen their NLP skills
Anyone who wants to deploy an ML model in production
Data Scientists who want to learn production-ready ML model deployment

Download link:
Only visible to registered members who have replied to the topic.

Links are Interchangeable - No Password - Single Extraction