Syllabus B Tech Computer Science Eighth Semester Machine Learning CS8003

The concepts developed in this course will aid in the quantification of several ideas in Computer Science Engineering that were introduced in earlier engineering courses. Technology increasingly builds on these foundations, and the latest syllabus for B Tech Computer Science Eighth Semester Machine Learning (CS8003) is given here.

The objective of this course, Syllabus B Tech Computer Science Eighth Semester Machine Learning CS8003, is to develop the ability to solve problems and to give insight into the problem-solving process, with emphasis on thermodynamics, specifically in the following manner: apply conservation principles (mass and energy) to evaluate the performance of simple engineering systems and cycles; evaluate the thermodynamic properties of simple homogeneous substances; analyze processes and cycles using the second law of thermodynamics to determine maximum efficiency and performance; discuss the physical relevance of the numerical values obtained for specific engineering problems, and the physical relevance of the problems in general; and critically evaluate the validity of the numerical solutions to specific engineering problems. More precisely, the objectives are:

  • To enable young technocrats to acquire the mathematical knowledge needed to understand the Laplace transform, the inverse Laplace transform and the Fourier transform, which are used in various branches of engineering.
  • To introduce effective mathematical tools for the numerical solution of algebraic and transcendental equations.
  • To acquaint the student with the statistical tools needed in various fields of science and engineering.

CS 8003 – Machine Learning

Unit 1
INTRODUCTION: Machine learning basics: what is machine learning, types and applications of ML, tools used, AI vs ML. Introduction to neural networks. Introduction to linear regression: SSE, gradient descent, closed form, normal equations, features. Introduction to classification: classification problems, decision boundaries, nearest neighbor methods. Overfitting and complexity; training, validation and test data. Introduction to Matlab (II).
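
The linear regression topics of Unit 1 (SSE, gradient descent, closed form, normal equations) can be illustrated with a short sketch. The snippet below is not part of the prescribed syllabus text; it uses Python with NumPy rather than the Matlab mentioned in the unit, and the synthetic data, learning rate and iteration count are arbitrary choices made only for the example.

```python
# Minimal sketch: fit a linear regression by the closed-form normal equations
# and by gradient descent on the sum-of-squared-errors (SSE) objective.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
X = np.column_stack([np.ones_like(x), x])        # design matrix with a bias column
y = 3.0 + 2.0 * x + rng.normal(0.0, 1.0, 50)     # noisy samples of y = 3 + 2x

# Closed form: solve the normal equations (X^T X) w = X^T y
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# Batch gradient descent on SSE (gradient scaled by 1/n for a stable step size)
w, lr = np.zeros(2), 0.05
for _ in range(5000):
    w -= lr * X.T @ (X @ w - y) / len(y)

print(w_closed, w)   # both estimates should be close to [3, 2]
```
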
Unit 2
SUPERVISED LEARNING: Introduction to supervised learning, supervised learning setup, LMS, linear methods for classification, linear methods for regression, support vector machines, basis expansions, model selection procedures. Perceptron, exponential family, generative learning algorithms, Gaussian discriminant analysis, Naive Bayes, support vector machines, model selection and feature selection, decision trees. Ensemble methods: bagging, boosting. Evaluating and debugging learning algorithms. Classification problems, decision boundaries, nearest neighbor methods; probability and classification; Bayes optimal decisions; Naive Bayes and Gaussian class-conditional distributions; linear classifiers; Bayes' rule and the Naive Bayes model; logistic regression; online gradient descent; neural networks; decision trees. Ensemble methods: bagging, random forests, boosting, with a more detailed discussion of decision trees and boosting.
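
As an illustration of the Naive Bayes topic with Gaussian class-conditional distributions listed in Unit 2, the sketch below (not part of the syllabus; the toy data and class means are invented for the example) fits per-class priors, feature means and variances, and classifies a point by the largest log-posterior.

```python
# Gaussian naive Bayes: independent Gaussian class-conditional features.
import numpy as np

rng = np.random.default_rng(1)
X0 = rng.normal([0.0, 0.0], 1.0, size=(100, 2))   # class 0 samples
X1 = rng.normal([3.0, 3.0], 1.0, size=(100, 2))   # class 1 samples
X, y = np.vstack([X0, X1]), np.array([0] * 100 + [1] * 100)

# Fit: per-class priors, feature means and variances
classes = np.unique(y)
priors = np.array([np.mean(y == c) for c in classes])
means = np.array([X[y == c].mean(axis=0) for c in classes])
variances = np.array([X[y == c].var(axis=0) for c in classes])

def predict(x):
    # log p(c) + sum_d log N(x_d | mean_{c,d}, var_{c,d}), maximised over classes
    log_post = np.log(priors) - 0.5 * np.sum(
        np.log(2 * np.pi * variances) + (x - means) ** 2 / variances, axis=1)
    return classes[np.argmax(log_post)]

print(predict(np.array([0.2, -0.1])), predict(np.array([2.8, 3.1])))   # 0, 1
```
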
Unit 3
REINFORCEMENT LEARNING: Markov decision processes (MDPs), HMMs, Bellman equations, value iteration and policy iteration, linear quadratic regulation (LQR), linear quadratic Gaussian (LQG) control, Q-learning, value function approximation, policy search, REINFORCE, POMDPs.
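
Value iteration on a small MDP, as named in Unit 3, can be sketched in a few lines. The MDP below (three states, two actions, transition and reward tables) is entirely made up for illustration; only the Bellman optimality update itself follows the standard definition.

```python
# Value iteration: V(s) <- max_a sum_s' P(s'|s,a) [R(s,a,s') + gamma V(s')]
import numpy as np

n_states, n_actions, gamma = 3, 2, 0.9
# P[a, s, s'] transition probabilities; R[a, s, s'] rewards (illustrative values)
P = np.array([
    [[0.8, 0.2, 0.0], [0.0, 0.8, 0.2], [0.0, 0.0, 1.0]],   # action 0
    [[0.2, 0.8, 0.0], [0.0, 0.2, 0.8], [0.0, 0.0, 1.0]],   # action 1
])
R = np.zeros((n_actions, n_states, n_states))
R[:, :, 2] = 1.0                                            # reward for landing in state 2

V = np.zeros(n_states)
for _ in range(200):
    Q = np.einsum('ast,ast->as', P, R + gamma * V)          # action values Q[a, s]
    V_new = Q.max(axis=0)                                   # greedy Bellman backup
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

policy = Q.argmax(axis=0)                                   # greedy policy per state
print(V, policy)
```
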
Unit 4
UNSUPERVISED LEARNING: Introduction to unsupervised learning: association rules, cluster analysis, reinforcement learning. Clustering: K-means, EM, mixture of Gaussians, factor analysis, PCA (principal components analysis), ICA (independent components analysis), hierarchical agglomeration; advanced discussion of clustering and EM. Latent space methods: PCA; text representations; Naive Bayes and multinomial models; clustering and latent space models. VC-dimension, structural risk minimization; margin methods and support vector machines (SVM); support vector machines and large-margin classifiers. Time series: Markov models, autoregressive models.
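
A minimal sketch of the K-means clustering topic from Unit 4, using Lloyd's algorithm on synthetic two-dimensional data; the cluster centres and the choice k = 3 are assumptions made only for this example.

```python
# K-means via Lloyd's algorithm: alternate nearest-centroid assignment and
# centroid recomputation. (Toy code: empty clusters are not handled.)
import numpy as np

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(c, 0.5, size=(50, 2)) for c in ([0, 0], [4, 0], [2, 4])])

k = 3
centroids = X[rng.choice(len(X), k, replace=False)]          # random initial centroids
for _ in range(100):
    # assignment step: index of the nearest centroid for each point
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # update step: each centroid moves to the mean of its assigned points
    new_centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    if np.allclose(new_centroids, centroids):
        break
    centroids = new_centroids

print(centroids)   # should land near (0,0), (4,0) and (2,4)
```
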
Unit 5
DIMENSIONALITY REDUCTION: Feature extraction, singular value decomposition. Feature selection: feature ranking and subset selection; filter, wrapper and embedded methods. Machine learning for big data: big data and MapReduce. Introduction to real-world ML: choosing an algorithm, design and analysis of ML experiments, common software for ML.
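
Feature extraction with the singular value decomposition, as named in Unit 5, can be illustrated by projecting centred data onto its leading singular directions (PCA via SVD). The toy data and the choice of two retained components below are arbitrary, for illustration only.

```python
# PCA via SVD: centre the data, take the top-k right singular vectors as the
# projection, and report the fraction of variance retained.
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))   # correlated toy features

Xc = X - X.mean(axis=0)                                   # centre the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2
Z = Xc @ Vt[:k].T                                         # reduced k-dimensional features
explained = (S[:k] ** 2).sum() / (S ** 2).sum()           # fraction of variance kept
print(Z.shape, round(explained, 3))
```
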

Books Recommended

1. Tom M. Mitchell, Machine Learning, McGraw-Hill Education (India) Private Limited, 2013.
2. Ethem Alpaydin, Introduction to Machine Learning (Adaptive Computation and Machine Learning), The MIT Press, 2004.
3. Stephen Marsland, Machine Learning: An Algorithmic Perspective, CRC Press, 2009.