
Introduction to Machine Learning NPTEL Week 3 Solutions NPTEL 2023

This set of MCQs (multiple choice questions) focuses on the Introduction to Machine Learning NPTEL Week 3 Solutions for NPTEL 2023.

With the increased availability of data from varied sources, there has been increasing attention paid to data-driven disciplines such as analytics and machine learning. In this course, we intend to introduce some of the basic concepts of machine learning from a mathematically well-motivated perspective. We will cover the different learning paradigms and some of the more popular algorithms and architectures used in each of these paradigms.

Course layout

Answers COMING SOON! Kindly Wait!


NOTE: You can check each answer immediately by clicking the “Show Answer” button. The “Introduction to Machine Learning NPTEL Week 3 Solutions” assignment contains the questions below.

Now, start attempting the quiz.

Introduction to Machine learning NPTEL 2023 Week 3 Solutions

Q1. Fill in the blanks: K-Nearest Neighbor is a _______, ________ algorithm

a) Non-parametric, eager b) Parametric, eager c) Non-parametric, lazy d) Parametric, lazy

Q2. You have been given the following 2 statements. Find out which of these options is/are true in the case of k-NN. (i) In the case of a very large value of k, we may include points from other classes into the neighborhood. (ii) In the case of a too-small value of k, the algorithm is very sensitive to noise.

a) (i) is True and (ii) is False b) (i) is False and (ii) is True c) Both are True d) Both are False

Q3. State whether the statement is True/False: k-NN algorithm does more computation on test time rather than train time.

a) True b) False


Q4. Suppose you are given the following images (1 represents the left image, 2 the middle, and 3 the right). Your task is to find the value of k in k-NN for each of the images shown below, where k1 is for the 1st, k2 for the 2nd, and k3 for the 3rd figure.

a) k1 > k2 > k3 b) k1 < k2 > k3 c) k1 < k2 < k3 d) None of these

Q5. Which of the following necessitates feature reduction in machine learning?

a) Irrelevant and redundant features b) Limited training data c) Limited computational resources d) All of the above

Q6. Suppose you have been given the following data, where x and y are the 2 input variables and Class is the dependent variable.

a) + Class b) – Class c) Can’t say d) None of these

Q7. What is the optimum number of principal components in the below figure?

a) 10 b) 20 c) 30 d) 40

Q8. Suppose we are using dimensionality reduction as a pre-processing technique, i.e., instead of using all the features, we reduce the data to k dimensions with PCA and then use these PCA projections as our features. Which of the following statements is correct?

a) Higher value of ‘k’ means more regularization b) Higher value of ‘k’ means less regularization

Q9. In collaborative filtering-based recommendation, the items are recommended based on:

a) Similar users b) Similar items c) Both of the above d) None of the above

Q10. The major limitation of collaborative filtering is:

a) Cold start b) Overspecialization c) None of the above

Q11. Consider the figures below. Which figure shows the most probable PCA component directions for the data points?

a) A b) B c) C d) D

Q12. Suppose that you wish to reduce the number of dimensions of a given dataset to k dimensions using PCA. Which of the following statements is correct?

a) Higher k means more regularization b) Higher k means less regularization c) Can’t say

Q13. Suppose you are given 7 plots, 1-7 (left to right), and you want to compare the Pearson correlation coefficients between the variables of each plot. Which of the following is true? 1. 1<2<3<4 2. 1>2>3>4 3. 7<6<5<4 4. 7>6>5>4

a) 1 and 3 b) 2 and 3 c) 1 and 4 d) 2 and 4

Q14. Imagine you are dealing with a 20-class classification problem. What is the maximum number of discriminant vectors that can be produced by LDA?

a) 20 b) 19 c) 21 d) 10

Q15. In which of the following situations is a collaborative filtering algorithm appropriate?

a) You manage an online bookstore and you have the book ratings from many users. For each user, you want to recommend other books he/she will like, based on his/her previous ratings and other users’ ratings. b) You manage an online bookstore and you have the book ratings from many users. You want to predict the expected sales volume (number of books sold) as a function of the average rating of a book. c) Both A and B d) None of the above

Q1. Which of the following is false about a logistic regression based classifier?

a) The logistic function is non-linear in the weights b) The logistic function is linear in the weights c) The decision boundary is non-linear in the weights d) The decision boundary is linear in the weights

Answer: a,c

Q2. Consider the case where two classes follow Gaussian distribution which are centered at (3, 9) and (−3, 3) and have identity covariance matrix. Which of the following is the separating decision boundary using LDA assuming the priors to be equal?

a) y−x=3 b) x+y=3 c) x+y=6 d) both (b) and (c) e) None of the above f) Can not be found from the given information
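For reference, when both classes share an identity covariance matrix and the priors are equal, the LDA decision boundary reduces to the perpendicular bisector of the segment joining the two class means. A minimal numpy sketch of that calculation for the means given in the question (the variable names are ours, purely for illustration):

```python
import numpy as np

# Class means from the question; shared identity covariance and equal priors assumed.
mu1 = np.array([3.0, 9.0])
mu2 = np.array([-3.0, 3.0])

# With shared covariance Sigma = I and equal priors, the LDA boundary is the set of
# points x with (mu1 - mu2) . x = (||mu1||^2 - ||mu2||^2) / 2, i.e. the
# perpendicular bisector of the segment joining the two means.
w = mu1 - mu2                      # normal vector of the separating hyperplane
b = (mu1 @ mu1 - mu2 @ mu2) / 2.0  # right-hand side of the boundary equation
print(w, b)                        # [6. 6.] 36.0
```

The printed values correspond to the line 6x + 6y = 36, which simplifies to x + y = 6.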

Q3. Consider the following relation between a dependent variable and an independent variable identified by doing simple linear regression. Which among the following relations between the two variables does the graph indicate?

[Figure]

a)  as the independent variable increases, so does the dependent variable b) as the independent variable increases, the dependent variable decreases c) if an increase in the value of the dependent variable is observed, then the independent variable will show a corresponding increase d) if an increase in the value of the dependent variable is observed, then the independent variable will show a corresponding decrease e)  the dependent variable in this graph does not actually depend on the independent variable f) none of the above

Q4. Given the following distribution of data points:

[Figure]

What method would you choose to perform Dimensionality Reduction?

a) Linear Discriminant Analysis b) Principal Component Analysis

Q5. In general, which of the following classification methods is the most resistant to gross outliers?

a) Quadratic Discriminant Analysis (QDA) b) Linear Regression c) Logistic regression d) Linear Discriminant Analysis (LDA)

Q6. Suppose that we have two variables, X and Y (the dependent variable). We wish to find the relation between them. An expert tells us that the relation between the two has the form Y = mX² + c. Available to us are samples of the variables X and Y. Is it possible to apply linear regression to this data to estimate the values of m and c?

a) no b) yes c) insufficient information
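Note that Y = mX² + c is linear in the unknowns m and c once X² is treated as the regressor, which is why ordinary least squares can be applied directly. A minimal sketch using made-up sample values purely for illustration:

```python
import numpy as np

# Made-up samples drawn from Y = m*X^2 + c with a little noise, only to
# illustrate that the model is linear in the unknowns (m, c).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=50)
Y = 2.5 * X**2 + 1.0 + rng.normal(scale=0.1, size=50)

# Treat X^2 as the single feature and fit by ordinary least squares.
A = np.column_stack([X**2, np.ones_like(X)])
(m_hat, c_hat), *_ = np.linalg.lstsq(A, Y, rcond=None)
print(m_hat, c_hat)   # recovers values close to the true 2.5 and 1.0
```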

Q7. In a binary classification scenario where x is the independent variable and y is the dependent variable, logistic regression assumes that the conditional distribution y|x follows a

a) Bernoulli distribution  b) binomial distribution  c) normal distribution  d) exponential distribution

Q8. Consider the following data:

[Figure]

Assuming that you apply LDA to this data, what is the estimated covariance matrix?

a) [[1.875, 0.3125], [0.3125, 0.9375]]
b) [[2.5, 0.4167], [0.4167, 1.25]]
c) [[1.875, 0.3125], [0.3125, 1.2188]]
d) [[2.5, 0.4167], [0.4167, 1.625]]
e) [[3.25, 1.1667], [1.1667, 2.375]]
f) [[2.4375, 0.875], [0.875, 1.7812]]
g) None of these
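Since the data table for this question sits in the figure, here is only a generic sketch of how the pooled (shared) covariance matrix used by LDA is estimated, on hypothetical two-class data; the numbers below are not the ones from the question:

```python
import numpy as np

# Hypothetical two-class data; the actual table for this question is in the figure.
X1 = np.array([[2.0, 3.0], [3.0, 4.0], [4.0, 4.0], [5.0, 6.0]])   # class 1
X2 = np.array([[6.0, 1.0], [7.0, 2.0], [8.0, 2.0], [9.0, 4.0]])   # class 2

def pooled_covariance(groups):
    """Pooled within-class covariance: the single Sigma estimated by LDA."""
    n_total = sum(len(g) for g in groups)
    k = len(groups)
    scatter = sum((len(g) - 1) * np.cov(g, rowvar=False) for g in groups)
    return scatter / (n_total - k)

print(pooled_covariance([X1, X2]))
```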

Q9. Given the following 3D input data, identify the principal component.

[Figure]

(Steps: center the data, calculate the sample covariance matrix, calculate the eigenvectors and eigenvalues, identify the principal component)

a) [−0.1022, 0.0018, 0.9948] b) [0.5742, −0.8164, 0.0605] c) [0.5742, 0.8164, 0.0605] d) [−0.5742, 0.8164, 0.0605] e) [0.8123, 0.5774, 0.0824] f) None of the above

Q10. For the data given in the previous question, find the transformed input along the first two principal components.

a) 8×2 matrix with columns [0.6100, −0.4487, −1.2651, 1.3345, 0.5474, −1.0250, −1.2672, 1.5142] and [−0.0196, −0.1181, −0.1163, 0.5702, −0.7257, 0.2727, 0.1724, −0.0355]
b) 8×2 matrix with columns [−0.1817, −1.2404, −2.0568, 0.5428, −0.2443, −1.8167, −2.0589, 0.7225] and [0.8944, 0.7959, 0.7977, 1.4842, 0.1884, 1.1868, 1.0864, 0.8785]
c) 8×2 matrix with columns [−6.2814, −4.3143, −3.7368, −1.7950, 2.2917, 3.5289, 4.9186, 5.3883] and [0.6100, −0.4487, −1.2651, 1.3345, 0.5474, −1.0250, −1.2672, 1.5142]
d) 8×2 matrix with columns [1.4721, 3.4392, 4.0166, 5.9584, 10.0451, 11.2823, 12.6720, 13.1418] and [−0.1817, −1.2404, −2.0568, 0.5428, −0.2443, −1.8167, −2.0589, 0.7225]
e) None of the above
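The steps listed for Q9 (center, covariance, eigendecomposition, pick the top component) and the projection asked for in Q10 translate directly into a few lines of numpy. Since the actual 3D inputs are only shown in the figure, the sketch below uses a small made-up dataset:

```python
import numpy as np

# Made-up 3D inputs; the real data for Q9/Q10 is shown in the figure above.
X = np.array([[2.5, 2.4, 0.5],
              [0.5, 0.7, 1.1],
              [2.2, 2.9, 0.4],
              [1.9, 2.2, 0.9],
              [3.1, 3.0, 0.3]])

Xc = X - X.mean(axis=0)               # 1. center the data
S = np.cov(Xc, rowvar=False)          # 2. sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)  # 3. eigenvalues/eigenvectors (ascending order)
order = np.argsort(eigvals)[::-1]     # sort components by decreasing variance
pc1 = eigvecs[:, order[0]]            # 4. first principal component direction
proj2 = Xc @ eigvecs[:, order[:2]]    # data projected onto the first two components

print(pc1)
print(proj2)
```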


Introduction To Machine Learning IIT-KGP Nptel Week 3 Assignment Answers

Are you looking for NPTEL Introduction To Machine Learning IIT-KGP Week 3 Answers 2024? This guide offers comprehensive assignment solutions tailored to help you master key machine learning concepts such as supervised learning, regression, and classification.

Course Link: Click Here


Introduction To Machine Learning IIT-KGP Week 3 Answers (July-Dec 2024)

Q1. What will be the class of a new data point x1=1 and x2=1 in 5-NN (k nearest neighbour with k=5) using euclidean distance measure? A. + Class B. – Class C. Cannot be determined

Answer: A. + Class

Q2. Imagine you are dealing with a 10 class classification problem. What is the maximum number of discriminant vectors that can be produced by LDA? A. 20 B. 14 C. 9 D. 10

Answer: C. 9



Q3. Fill in the blanks: K-Nearest Neighbor is a ______, ______ algorithm. A. Non-parametric, eager B. Parametric, eager C. Non-parametric, lazy D. Parametric, lazy

Answer: C. Non-parametric, lazy

Q4. Which of the following statements is True about the KNN algorithm? A. KNN algorithm does more computation on test time rather than train time. B. KNN algorithm does lesser computation on test time rather than train time. C. KNN algorithm does an equal amount of computation on test time and train time. D. None of these.

Answer: A. KNN algorithm does more computation on test time rather than train time.

Q5. Which of the following necessitates feature reduction in machine learning?

1. Irrelevant and redundant features
2. Curse of dimensionality
3. Limited computational resources

A. 1 only B. 2 only C. 1 and 2 only D. 1, 2 and 3

Answer: D. 1, 2 and 3

Q6. When there is noise in data, which of the following options would improve the performance of the k-NN algorithm? A. Increase the value of k B. Decrease the value of k C. Changing value of k will not change the effect of the noise D. None of these

Answer: A. Increase the value of k

Q7. Find the value of the Pearson’s correlation coefficient of X and Y from the data in the following table.

A. 0.47 B. 0.68 C. 1 D. 0.33

Answer: B. 0.68

Q8. Which of the following statements is/are true about PCA?

  • PCA is a supervised method
  • It identifies the directions that data have the largest variance
  • Maximum number of principal components <= number of features
  • All principal components are orthogonal to each other

A. Only 2 B. 1, 3 and 4 C. 1, 2 and 3 D. 2, 3 and 4

Answer: D. 2, 3 and 4

Q9. In user-based collaborative filtering based recommendation, the items are recommended based on: A. Similar users B. Similar items C. Both of the above D. None of the above

Answer: A. Similar users
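To make the idea concrete: in user-based collaborative filtering, items are scored for the target user using the ratings of users with similar rating vectors. A minimal cosine-similarity sketch on a hypothetical ratings matrix (all numbers here are made up):

```python
import numpy as np

# Hypothetical user-item rating matrix (rows: users, columns: items, 0 = unrated).
R = np.array([[5, 4, 0, 1],
              [4, 5, 1, 0],
              [1, 0, 5, 4],
              [0, 1, 4, 5]], dtype=float)

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)

target = 0                                                   # recommend for user 0
sims = np.array([cosine(R[target], R[u]) for u in range(len(R))])
sims[target] = 0.0                                           # ignore self-similarity

# Score each item as a similarity-weighted average of the other users' ratings.
scores = sims @ R / (sims.sum() + 1e-12)
unrated = R[target] == 0
recommended = np.flatnonzero(unrated)[np.argsort(-scores[unrated])]
print(recommended)   # unrated item indices for the target user, best first
```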

Q10 .Identify whether the following statement is true or false? “Linear Discriminant Analysis (LDA) is a supervised method” A. TRUE B. FALSE

Answer: A. TRUE


More Nptel Courses: https://progiez.com/nptel-assignment-answers

Introduction To Machine Learning IIT-KGP Week 3 Answers (July-Dec 2022)

Course Name: Introduction To Machine Learning IITKGP

Link to Enroll:  Click Here

Q1. Suppose you have been given the following data, where x and y are the 2 input variables and Class is the dependent variable. You want to predict the class of a new data point x=1 and y=1 using Euclidean distance in 3-NN. To which class does the new data point belong?

a. +Class b. -Class c. Can’t say d. None of these

Answer: b. – Class

Q2. Imagine you are dealing with a 10 class classification problem. What is the maximum number of discriminant vectors that can be produced by LDA? a. 20 b. 14 c. 9 d. 10

Answer: c. 9

Q3. Fill in the blanks: K-Nearest Neighbor is a ______, ______ algorithm a. Non-parametric, eager b. Parametric, eager c. Non-parametric, lazy d. Parametric, lazy

Answer: c. Non-parametric, lazy

Q4. Which of the following statements is True about the KNN algorithm? a. KNN algorithm does more computation on test time rather than train time. b. KNN algorithm does lesser computation on test time rather than train time. c. KNN algorithm does an equal amount of computation on test time and train time. d. None of these.

Answer: a. KNN algorithm does more computation on test time rather than train time.

Q5. Which of the following necessitates feature reduction in machine learning? A. Irrelevant and redundant features B. Curse of dimensionality C. Limited computational resources. D. All of the above

Answer: D. All of the above

Q6. When there is noise in data, which of the following options would improve the performance of the KNN algorithm? a. Increase the value of k b. Decrease the value of k c. Changing value of k will not change the effect of the noise d. None of these

Answer: a. Increase the value of k

Q7. Find the value of the Pearson’s correlation coefficient of X and Y from the data in the following table. a. 0.47 b. 0.68 c. 1 d. 0.33

Answer: b. 0.68

Q8. Which of the following is false about PCA? a. PCA is a supervised method b. It identifies the directions that data have the largest variance c. Maximum number of principal components = number of features d. All principal components are orthogonal to each other

Answer: a. PCA is a supervised method

Q9. In user-based collaborative filtering based recommendation, the items are recommended based on : a. Similar users b. Similar items c. Both of the above d. None of the above

Answer: a. Similar users

Q10. Identify whether the following statement is true or false? “PCA can be used for projecting and visualizing data in lower dimensions.” a. True b. False

Answer: a. True


NPTEL Introduction to Machine Learning Week 3 Assignment Answers 2024


1. For a two-class problem using discriminant functions (δk – discriminant function for class k), where is the separating hyperplane located?

  • Where δ1>δ2
  • Where δ1<δ2
  • Where δ1=δ2
  • Where δ1+δ2=1

2. Given the following dataset consisting of two classes, A and B, calculate the prior probability of each class.

[Figure]

What are the prior probabilities of class A and class B?

  • P(A)=0.5,P(B)=0.5
  • P(A)=0.625,P(B)=0.375
  • P(A)=0.375,P(B)=0.625
  • P(A)=0.6,P(B)=0.4
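For reference, class priors are simply the relative frequencies of the classes in the training set. Since the dataset for this question is only shown in the figure, the sketch below illustrates the computation on made-up labels, not the actual data:

```python
from collections import Counter

# Made-up class labels; the actual dataset for this question is in the figure.
labels = ["A", "A", "A", "A", "A", "B", "B", "B"]

counts = Counter(labels)
priors = {c: n / len(labels) for c, n in counts.items()}
print(priors)   # {'A': 0.625, 'B': 0.375} for this made-up 5/3 split
```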

3. In a 3-class classification problem using linear regression, the output vectors for three data points are [0.8, 0.3, -0.1], [0.2, 0.6, 0.2], and [-0.1, 0.4, 0.7]. To which classes would these points be assigned?

4. If you have a 5-class classification problem and want to avoid masking using polynomial regression, what is the minimum degree of the polynomial you should use?

5. Consider a logistic regression model where the predicted probability for a given data point is 0.4. If the actual label for this data point is 1, what is the contribution of this data point to the log-likelihood?
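Under the usual Bernoulli model assumed by logistic regression, a point with label y and predicted probability p contributes y·log(p) + (1 − y)·log(1 − p) to the log-likelihood. A quick check with the numbers stated in the question:

```python
import math

p = 0.4   # predicted probability from the question
y = 1     # actual label from the question

contribution = y * math.log(p) + (1 - y) * math.log(1 - p)
print(contribution)   # log(0.4), roughly -0.916
```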

6. What additional assumption does LDA make about the covariance matrix in comparison to the basic assumption of Gaussian class conditional density?

  • The covariance matrix is diagonal
  • The covariance matrix is identity
  • The covariance matrix is the same for all classes
  • The covariance matrix is different for each class

7. What is the shape of the decision boundary in LDA?

  • Can not be determined




NPTEL Assignment Answers and Solutions 2024 (July-Dec). Get answers for Weeks 1 to 12 for all courses. This guide offers clear and accurate answers for all your assignments across various NPTEL courses.

progiez/nptel-assignment-answers


NPTEL Assignment Answers 2024 with Solutions (July-Dec)

How to use this repo to see NPTEL assignment answers and solutions 2024

If you're here to find answers for specific NPTEL courses, follow these steps:

Access the Course Folder:

  • Navigate to the folder of the course you are interested in. Each course has its own folder named accordingly, such as cloud-computing or computer-architecture.

Locate the Weekly Assignment Files:

  • Inside the course folder, you will find files named week-01.md, week-02.md, and so on up to week-12.md. These files contain the assignment answers for each respective week.

Select the Week File:

  • Click on the file corresponding to the week you are interested in. For example, if you need answers for Week 3, open the week-03.md file.

Review the Answers:

  • Each week-XX.md file provides detailed solutions and explanations for that week’s assignments. Review these files to find the information you need.

By following these steps, you can easily locate and use the assignment answers and solutions for the NPTEL courses provided in this repository. We hope this resource assists you in your studies!

List of Courses

Here's a list of courses currently available in this repository:

  • Artificial Intelligence Search Methods for Problem Solving
  • Cloud Computing
  • Computer Architecture
  • Cyber Security and Privacy
  • Data Science for Engineers
  • Data Structure and Algorithms Using Java
  • Database Management System
  • Deep Learning for Computer Vision
  • Deep Learning IIT Ropar
  • Digital Circuits
  • Ethical Hacking
  • Introduction to Industry 4.0 and Industrial IoT
  • Introduction to Internet of Things
  • Introduction to Machine Learning IIT KGP
  • Introduction to Machine Learning
  • Introduction to Operating Systems
  • ML and Deep Learning Fundamentals and Applications
  • Problem Solving Through Programming in C
  • Programming DSA Using Python
  • Programming in Java
  • Programming in Modern C
  • Python for Data Science
  • Soft Skill Development
  • Soft Skills
  • Software Engineering
  • Software Testing
  • The Joy of Computation Using Python
  • Theory of Computation

Note: This repository is intended for educational purposes only. Please use the provided answers as a guide to better understand the course material.

📧 Contact Us

For any queries or support, feel free to reach out to us at [email protected] .

🌐 Connect with Progiez

Website

⭐️ Follow Us

Stay updated with our latest content and updates by following us on our social media platforms!

🚀 About Progiez

Progiez is an online educational platform aimed at providing solutions to various online courses offered by NPTEL, Coursera, LinkedIn Learning, and more. Explore our resources for detailed answers and solutions to enhance your learning experience.

Disclaimer: This repository is intended for educational purposes only. All content is provided for reference and should not be submitted as your own work.


  • Computer Science and Engineering
  • NOC:Introduction to Machine Learning (Video) 
  • Co-ordinated by : IIT Kharagpur
  • Available from : 2016-09-08
  • Intro Video
  • Lecture 01: Introduction
  • Lecture 02: Different Types of Learning
  • Lecture 03: Hypothesis Space and Inductive Bias
  • Lecture 04: Evaluation and Cross-Validation
  • Lecture 05 : Linear Regression
  • Lecture 06 : Introduction to Decision Trees
  • Lecture 07 : Learning Decision Tree
  • Lecture 08 : Overfitting
  • Lecture 9: Python Exercise on Decision Tree and Linear Regression
  • Tutorial II
  • Lecture 12: k-Nearest Neighbour
  • Lecture 13: Feature Selection
  • Lecture 14: Feature Extraction
  • Lecture 15: Collaborative Filtering
  • Lecture 16: Python Exercise on kNN and PCA
  • Lecture 17: Tutorial III
  • Lecture 18: Bayesian Learning
  • Lecture 19: Naive Bayes
  • Lecture 20 : Bayesian Network
  • Lecture 21: Python Exercise on Naive Bayes
  • Lecture 22: Tutorial IV
  • Lecture 23 : Logistic Regression
  • Lecture 24: Introduction Support Vector Machine
  • Lecture 25: SVM : The Dual Formulation
  • Lecture 26: SVM : Maximum Margin with Noise
  • Lecture 27: Nonlinear SVM and Kernel Function
  • Lecture 28: SVM : Solution to the Dual Problem
  • Lecture 29: Python Exercise on SVM
  • Lecture 30: Introduction
  • Lecture 31: Multilayer Neural Network
  • Lecture 32 : Neural Network and Backpropagation Algorithm
  • Lecture 33: Deep Neural Network
  • Lecture 34: Python Exercise on Neural Network
  • Lecture 35: Tutorial VI
  • Lecture 36: Introduction to Computational Learning Theory
  • Lecture 37: Sample Complexity : Finite Hypothesis Space
  • Lecture 38: VC Dimension
  • Lecture 39 : Introduction to Ensembles
  • Lecture 40 : Bagging and Boosting
  • Lecture 41 : Introduction to Clustering
  • Lecture 42 : Kmeans Clustering
  • Lecture 43: Agglomerative Hierarchical Clustering
  • Lecture 44: Python Exercise on kmeans clustering

NPTEL Introduction to Machine Learning Assignment 3 Answers 2023

NPTEL Introduction to Machine Learning Assignment 3 Answers 2023: In this article, we have provided the answers to Introduction to Machine Learning Assignment 3. You should still complete and submit your assignment based on your own knowledge.

NPTEL Introduction To Machine Learning Week 3 Assignment Answer 2023

1. Which of the following are differences between LDA and Logistic Regression?

  • Logistic Regression is typically suited for binary classification, whereas LDA is directly applicable to multi-class problems
  • Logistic Regression is robust to outliers whereas LDA is sensitive to outliers
  • both (a) and (b)
  • None of these

2. We have two classes in our dataset. The two classes have the same mean but different variance.

  • LDA can classify them perfectly.
  • LDA can NOT classify them perfectly.
  • LDA is not applicable in data with these properties
  • Insufficient information

3. We have two classes in our dataset. The two classes have the same variance but different mean.

  • LDA can classify them perfectly.
  • LDA can NOT classify them perfectly.
  • LDA is not applicable in data with these properties
  • Insufficient information

4. Given the following distribution of data points:

[Figure]

What method would you choose to perform Dimensionality Reduction?

  • Linear Discriminant Analysis
  • Principal Component Analysis
  • Both LDA and/or PCA
  • None of the above

5. If log((1 − p(x)) / (1 + p(x))) = β0 + βx, what is p(x)?

  • p(x) = (1 + e^(β0+βx)) / e^(β0+βx)
  • p(x) = (1 + e^(β0+βx)) / (1 − e^(β0+βx))
  • p(x) = e^(β0+βx) / (1 + e^(β0+βx))
  • p(x) = (1 − e^(β0+βx)) / (1 + e^(β0+βx))
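Writing t = β0 + βx, the question reduces to solving log((1 − p)/(1 + p)) = t for p. A short symbolic check of that algebra (a sketch using sympy, with t standing in for β0 + βx):

```python
import sympy as sp

p, t = sp.symbols("p t")

# Exponentiating log((1 - p)/(1 + p)) = t gives (1 - p)/(1 + p) = exp(t);
# solve that rational equation for p.
solution = sp.solve(sp.Eq((1 - p) / (1 + p), sp.exp(t)), p)
print(sp.simplify(solution[0]))   # equivalent to (1 - exp(t)) / (1 + exp(t))
```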

[Figure]

6. For the two classes ‘+’ and ‘−’ shown in the figure, which line is the most appropriate for projecting the data points while performing LDA?

  • Red
  • Orange
  • Blue
  • Green

7. Which of these techniques do we use to optimise Logistic Regression:

  • Least Square Error
  • Maximum Likelihood
  • (a) or (b) are equally good
  • (a) and (b) perform very poorly, so we generally avoid using Logistic Regression
  • None of these

8. LDA assumes that the class data is distributed as:

  • Poisson
  • Uniform
  • Gaussian
  • LDA makes no such assumption.

9. Suppose we have two variables, X and Y (the dependent variable), and we wish to find their relation. An expert tells us that the relation between the two has the form Y = m·e^X + c. Suppose the samples of the variables X and Y are available to us. Is it possible to apply linear regression to this data to estimate the values of m and c?

  • No.
  • Yes.
  • Insufficient information.
  • None of the above.

10. What might happen to our logistic regression model if the number of features is more than the number of samples in our dataset?

  • It will remain unaffected
  • It will not find a hyperplane as the decision boundary
  • It will overfit
  • None of the above

NPTEL Introduction to Machine Learning Assignment 3 Answers [July 2022]

1. For linear classification we use: a. A linear function to separate the classes. b. A linear function to model the data. c. A linear loss. d. A non-linear function to fit the data.

2. Logit transformation for Pr(X=1) for the given data S = [0, 1, 1, 0, 1, 0, 1]: a. 3/4 b. 4/3 c. 4/7 d. 3/7
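Assuming “logit transformation” here refers to the empirical log-odds of the sample (an interpretation, not stated explicitly in the question), the computation is just the fraction of 1s and its odds:

```python
import math

S = [0, 1, 1, 0, 1, 0, 1]
p_hat = sum(S) / len(S)      # empirical Pr(X = 1) = 4/7
odds = p_hat / (1 - p_hat)   # odds = 4/3
print(p_hat, odds, math.log(odds))   # 4/7, 4/3, and log-odds of about 0.288
```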



3. The output of binary class logistic regression lies in this range. a. [−∞, ∞] b. [−1, 1] c. [0, 1] d. [−∞, 0]

4. If log((1 − p(x)) / (1 + p(x))) = β0 + βx, what is p(x)?

5. Logistic regression is robust to outliers. Why? a. The squashing of output values between [0, 1] dampens the effect of outliers. b. Linear models are robust to outliers. c. The parameters in logistic regression tend to take small values due to the nature of the problem setting and hence outliers get translated to the same range as other samples. d. The given statement is false.

6. Aim of LDA is (multiple options may apply): a. Minimize intra-class variability. b. Maximize intra-class variability. c. Minimize the distance between the means of the classes. d. Maximize the distance between the means of the classes.


7. We have two classes in our dataset with mean 0 and 1, and variance 2 and 3. a. LDA may be able to classify them perfectly. b. LDA will definitely be able to classify them perfectly. c. LDA will definitely NOT be able to classify them perfectly. d. None of the above.

8. We have two classes in our dataset with mean 0 and 5, and variance 1 and 2. a. LDA may be able to classify them perfectly. b. LDA will definitely be able to classify them perfectly. c. LDA will definitely NOT be able to classify them perfectly. d. None of the above.

9. For the two classes ‘+’ and ‘−’ shown below, which line is the most appropriate for projecting the data points while performing LDA? a. Red b. Orange c. Blue d. Green

10. LDA assumes that the class data is distributed as: a. Poisson b. Uniform c. Gaussian d. LDA makes no such assumption.



What is Introduction to Machine Learning?

With the increased availability of data from varied sources, there has been increasing attention paid to data-driven disciplines such as analytics and machine learning. In this course, we intend to introduce some of the basic concepts of machine learning from a mathematically well-motivated perspective. We will cover the different learning paradigms and some of the more popular algorithms and architectures used in each of these paradigms.

CRITERIA TO GET A CERTIFICATE

Average assignment score = 25% of the average of the best 8 assignments out of the total 12 assignments given in the course.
Exam score = 75% of the proctored certification exam score out of 100.

Final score = Average assignment score + Exam score

YOU WILL BE ELIGIBLE FOR A CERTIFICATE ONLY IF THE AVERAGE ASSIGNMENT SCORE >=10/25 AND EXAM SCORE >= 30/75. If one of the 2 criteria is not met, you will not get the certificate even if the Final score >= 40/100.


NPTEL Introduction to Machine Learning Assignment 3 Answers [Jan 2022]

Q1. Consider the case where two classes follow Gaussian distributions which are centered at (6, 8) and (−6, −4) and have identity covariance matrix. Which of the following is the separating decision boundary using LDA assuming the priors to be equal?

(A) x+y=2 (B) y−x=2 (C) x=y (D) both (a) and (b) (E) None of the above (F) Can not be found from the given information

Answer:- (A) x+y=2


Q2. Which of the following are differences between PCR and LDA?

(A) PCR is unsupervised whereas LDA is supervised (B) PCR maximizes the variance in the data whereas LDA maximizes the separation between the classes (C) both (a) and (b) (D) None of these

Answer:- (A) PCR is unsupervised whereas LDA is supervised

Q3. Which of the following are differences between LDA and Logistic Regression?

(A) Logistic Regression is typically suited for binary classification, whereas LDA is directly applicable to multi-class problems (B) Logistic Regression is robust to outliers whereas LDA is sensitive to outliers (C) both (a) and (b) (D) None of these

Answer:- (C) both (a) and (b)


Q4. We have two classes in our dataset. The two classes have the same mean but different variance.

  • LDA can classify them perfectly.
  • LDA can NOT classify them perfectly.
  • LDA is not applicable in data with these properties
  • Insufficient information

Answer:- 2. LDA can NOT classify them perfectly.

Q5. We have two classes in our dataset. The two classes have the same variance but different mean. (Options as in Q4.)

Answer:- 1. LDA can classify them perfectly.

Q6. Which of these techniques do we use to optimise Logistic Regression:

  • Least Square Error
  • Maximum Likelihood
  • (a) or (b) are equally good
  • (a) and (b) perform very poorly, so we generally avoid using Logistic Regression

Answer:- 2. Maximum Likelihood

Q7. Suppose we have two variables, X and Y (the dependent variable), and we wish to find their relation. An expert tells us that the relation between the two has the form Y = m·e^X + c. Suppose samples of the variables X and Y are available to us. Is it possible to apply linear regression to this data to estimate the values of m and c?

  • No
  • Yes
  • Insufficient information

Answer:- 2. Yes

Q8. What might happen to our logistic regression model if the number of features is more than the number of samples in our dataset?

  • It will remain unaffected
  • It will not find a hyperplane as the decision boundary
  • It will overfit
  • None of the above

Answer:- 3. It will overfit

Q9. Logistic regression also has an application in

  • Regression problems
  • Sensitivity analysis
  • Both (a) and (b)

Answer:- 3. Both (a) and (b)

Q10. Consider the following datasets:

[Figure]

Which of these datasets can you achieve zero training error using Logistic Regression (without any additional feature transformations)?

  • Both the datasets
  • Only on dataset 1
  • Only on dataset 2
  • None of the datasets

Answer:- For Answer Click Here


Disclaimer:- We do not claim 100% surety of these solutions; they are based on our sole expertise, and by posting these answers we are simply looking to help students as a reference. We urge you to do your assignment on your own.



NPTEL Introduction to Machine Learning Assignment Answers Week 3 2022 (IIT-KGP)

Are you looking for help with the Machine Learning NPTEL Week 3 Assignment Answers? In this article, we have provided hints for the Machine Learning Week 3 Assignment answers.

NPTEL Introduction to Machine Learning Assignment Answers Week 3

Q1. Suppose you have been given the following data, where x and y are the 2 input variables and Class is the dependent variable.

X | Y | Class
-1 | 1 | –
0 | 1 | +
0 | 2 | –
1 | -1 | –
1 | 0 | +
1 | 2 | +
2 | 2 | –
2 | 3 | +

Suppose you want to predict the class of a new data point x=1 and y=1 using Euclidean distance in 3-NN. To which class does the new data point belong?

a. + Class b. – Class c. Can’t say d. None of these

Answer: a. + Class
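The stated answer can be checked directly: compute the Euclidean distance from (1, 1) to every row of the table above and take a majority vote over the three nearest points. A minimal sketch using the (X, Y, Class) rows from the table:

```python
import numpy as np
from collections import Counter

# Training data from the table above: (X, Y) points and their class labels.
points = np.array([[-1, 1], [0, 1], [0, 2], [1, -1],
                   [1, 0], [1, 2], [2, 2], [2, 3]], dtype=float)
labels = ["-", "+", "-", "-", "+", "+", "-", "+"]

query = np.array([1.0, 1.0])
dists = np.linalg.norm(points - query, axis=1)   # Euclidean distances to the query
nearest3 = np.argsort(dists)[:3]                 # indices of the 3 nearest points
vote = Counter(labels[i] for i in nearest3)
print(vote.most_common(1)[0][0])                 # majority class among the 3 neighbours
```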


Q2. Imagine you are dealing with a 10 class classification problem. What is the maximum number of discriminant vectors that can be produced by LDA?

Answer : c. 9

Q3. Fill in the blanks:

K-Nearest Neighbor is a ________,_______ algorithm.

a. Non-parametric, eager

b. Parametric, eager

c. Non-parametric, lazy

d. Parametric, lazy

Answer: c. Non-parametric, lazy

Q4. Which of the following statements is True about the KNN algorithm?

a. KNN algorithm does more computation on test time rather than train time.

b. KNN algorithm does lesser computation on test time rather than train time.

c. KNN algorithm does an equal amount of computation on test time and train time.

d. None of these.

Answer: a. KNN algorithm does more computation on test time rather than train time.

Q5. Which of the following necessitates feature reduction in machine learning?

a. Irrelevant and redundant features

b. Curse of dimensionality

c. Limited computational resources.

d. All of the above

Answer: d. All of the above

Q6. When there is noise in data, which of the following options would improve the performance of the KNN algorithm?

a. Increase the value of k

b. Decrease the value of k

c. Changing value of k will not change the effect of the noise

d. None of these

Answer: a. Increase the value of k

Q7. Find the value of the Pearson’s correlation coefficient of X and Y from the data in the following table.

AGE (X) | GLUCOSE (Y)
43 | 99
21 | 65
25 | 79
42 | 75

a. 0.47 b. 0.68 c. 1 d. 0.33

Answer: b. 0.68
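The value can be verified directly from the four (X, Y) pairs in the table above with a quick numpy check:

```python
import numpy as np

age = np.array([43, 21, 25, 42], dtype=float)       # X column from the table
glucose = np.array([99, 65, 79, 75], dtype=float)   # Y column from the table

r = np.corrcoef(age, glucose)[0, 1]
print(round(r, 2))   # approximately 0.68
```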

Q8. Which of the following is false about PCA?

a. PCA is a supervised method

b. It identifies the directions that data have the largest variance

c. Maximum number of principal components <= number of features

d. All principal components are orthogonal to each other

Answer: a. PCA is a supervised method

Q9. In user-based collaborative filtering based recommendation, the items are recommended based on :

a. Similar users

b. Similar items

c. Both of the above

d. None of the above

Answer : a. Similar users

Q10. Identify whether the following statement is true or false? “PCA can be used for projecting and visualizing data in lower dimensions.”

Answer: a. TRUE


Disclaimer: These answers are provided only to help students as a reference. This website does not claim 100% correctness of the answers, so we urge you to complete your assignment yourself.



Introduction to Machine Learning - IITKGP


Note: This exam date is subject to change based on seat availability. You can check the final exam date on your hall ticket.


Books and references

  • Machine Learning. Tom Mitchell. First Edition, McGraw- Hill, 1997.
  • Introduction to Machine Learning Edition 2, by Ethem Alpaydin

Instructor bio


Prof. Sudeshna Sarkar



Course Status: Completed
Course Type: Elective
Duration: 8 weeks
Credit Points: 2
Undergraduate/Postgraduate
Start Date: 26 Jul 2021
End Date: 17 Sep 2021
Enrollment Ends: 09 Aug 2021
Exam Date: 26 Sep 2021 IST