
Topic author: Machine Learning Series: Decision Tree Algorithm in Python  (Read 330 times)


Offline mitsumi

  • Sub-Administrator
  • Posts: 121842
  • Karma: +0/-0
Machine Learning Series: Decision Tree Algorithm in Python
« on: 09 June 2019, 16:39 »

Machine Learning Series: Decision Tree Algorithm in Python
MP4 | Video: AVC 1280x720 | Audio: AAC 44 kHz 2ch | Duration: 1h 16m | 246 MB
Genre: eLearning | Language: English

Dhiraj, a data scientist and machine learning evangelist, continues his teaching of machine learning algorithms with this video series on the decision tree algorithm. Learn all about this powerful algorithm across these eight topics:

Introducing Decision Trees. This first video in the decision tree series introduces this powerful yet simple algorithm. A decision tree works like a series of conditional control statements (IF-ELSE). Understand key decision tree concepts, including root node, decision node, leaf node, parent node, splitting, and pruning. There are two types of decision trees: decision trees with categorical variables and decision trees with continuous variables. There are also three techniques for generating a decision tree: ID3 (Iterative Dichotomiser 3), C4.5, and CART (Classification and Regression Trees).
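To illustrate the IF-ELSE analogy above, here is a hypothetical toy tree written as nested conditionals; the features, thresholds, and decision are invented for illustration and are not taken from the course:

```python
# A decision tree is equivalent to nested IF-ELSE rules.
# Toy tree deciding whether to play outside, based on two
# features: weather (categorical) and temperature (continuous).

def play_outside(weather: str, temperature: float) -> bool:
    if weather == "sunny":          # root node: split on weather
        if temperature > 30:        # decision node: split on temperature
            return False            # leaf node: too hot
        return True                 # leaf node
    elif weather == "rainy":
        return False                # leaf node
    return True                     # leaf node (e.g. "cloudy")

print(play_outside("sunny", 25))   # True
```

Each `if` corresponds to a decision node, each `return` to a leaf; algorithms like ID3 and CART learn these rules from data instead of hand-writing them.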
Decision Tree Advantages and Disadvantages. This second video in the decision tree series covers both the advantages and disadvantages of using decision trees. Advantages include generally less effort for data preparation, no requirement to normalize the data, and missing values not affecting the tree-building process. Disadvantages include difficulty handling volatile data, potentially complex calculations, and substantial training time.
Decision Tree Regression. This third video in the decision tree series explains how to perform decision tree regression. Decision tree regression observes the features of an object and trains a model to predict meaningful continuous output. A discrete target example is predicting the weather on a particular day; a continuous target example is predicting the profit generated from sales. Understand the difference between decision tree regression and linear regression.
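As a minimal sketch of decision tree regression with scikit-learn (the synthetic "ad spend vs. profit" data and the `max_depth=3` setting are illustrative assumptions, not from the video):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Synthetic continuous target: profit as a noisy linear function of spend.
rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 10, 80)).reshape(-1, 1)  # feature: spend
y = 2.0 * X.ravel() + rng.normal(0, 0.5, 80)        # continuous target

model = DecisionTreeRegressor(max_depth=3, random_state=0)
model.fit(X, y)

# Unlike a classifier, the output is a continuous value (a leaf mean),
# and unlike linear regression, the fit is piecewise constant.
pred = model.predict([[5.0]])
print(float(pred[0]))
```

The prediction near x = 5 lands close to the true value 10, but as a step function rather than a straight line.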
Decision Tree Classification. This fourth video in the decision tree series focuses on the decision tree classifier. The data set is split into subsets based on an attribute value test, and those subsets are split further in a process called recursive partitioning. Understand the difference between decision tree classification and linear regression.
Decision Tree Information Gain. This fifth video in the decision tree series explains information gain in depth. Information gain measures how much information a feature in a given dataset provides about the class. Also learn all about entropy, which plays an essential role in deciding how a decision tree will split data.
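Entropy and information gain can be computed directly from label counts. A from-scratch sketch (the example arrays are invented to show a perfect split):

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (in bits) of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def information_gain(parent, left, right):
    """Entropy reduction achieved by splitting parent into two children."""
    n = len(parent)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted

parent = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # 50/50 split -> entropy 1.0
left   = np.array([0, 0, 0, 0])              # pure child -> entropy 0.0
right  = np.array([1, 1, 1, 1])              # pure child -> entropy 0.0
print(information_gain(parent, left, right)) # 1.0: a perfect split
```

A tree grower evaluates candidate splits and keeps the one with the highest information gain; this is the criterion behind ID3 and C4.5.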
Building a Decision Tree in Python. This sixth video in the decision tree series shows you hands-on how to create a decision tree using Python.
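The video builds its tree hands-on; as a rough sketch of what that typically looks like with scikit-learn (the Iris dataset, the split ratio, and the hyperparameters here are assumptions, not taken from the course):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Load a small labeled dataset and hold out a test set.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

# criterion="entropy" makes the tree split by information gain,
# tying this step back to the previous video's topic.
clf = DecisionTreeClassifier(criterion="entropy", max_depth=3,
                             random_state=42)
clf.fit(X_train, y_train)

print(clf.score(X_test, y_test))  # held-out accuracy
```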
Decision Tree Prediction. This seventh video in the decision tree series explains how to create a sample input for the model, use this sample input to have the model make a prediction, and then compare the prediction to the actual output. We will be using the scikit-learn predict() function.
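The source confirms the course uses scikit-learn's predict(); a minimal sketch of building a sample input and checking the model's answer might look like this (the Iris dataset and the flower measurements are illustrative assumptions):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, random_state=0)
clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Create one sample input (sepal/petal measurements of a setosa flower)
# and let the model predict its class.
sample = [[5.1, 3.5, 1.4, 0.2]]
pred = clf.predict(sample)
print(iris.target_names[pred][0])  # human-readable class name
```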
Decision Tree Evaluation. This eighth video in the decision tree series explains how to evaluate a decision tree model. Once the machine learning model has been evaluated, we can use the feedback to improve the model until our model produces the desired accuracy. We will use a Confusion Matrix to evaluate our decision tree model.
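The source names the confusion matrix as the evaluation tool; a sketch with scikit-learn's `confusion_matrix` (the Iris dataset and split are assumptions):

```python
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
clf = DecisionTreeClassifier(random_state=1).fit(X_train, y_train)

y_pred = clf.predict(X_test)
cm = confusion_matrix(y_test, y_pred)  # rows: actual, columns: predicted
print(cm)                              # off-diagonal cells are mistakes
print(accuracy_score(y_test, y_pred))
```

Off-diagonal counts show exactly which classes get confused with which, which is the feedback used to tune the model until it reaches the desired accuracy.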
           

               
 
Download link:
Only visible to registered users who have replied to the topic.

Links are Interchangeable - No Password - Single Extraction