AKSHAT RAO PATIL

Campus Ambassador at viden.io

Studied at St. Stephens Senior Secondary School

UPSC Mains Syllabus 2019 document

This document contains the syllabus for UPSC Mains. It also contains the pattern for the main examination.

Analog Communication - DSBSC Modulation

In the process of amplitude modulation, the modulated wave consists of the carrier wave and two sidebands, but the information is carried only in the sidebands. A sideband is simply a band of frequencies containing power, lying just below and just above the carrier frequency.
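As a sketch of the idea, a DSBSC wave can be produced by simply multiplying the message samples by the carrier; the product-to-sum identity cos A · cos B = ½[cos(A − B) + cos(A + B)] is what places all the power in the two sidebands, with no carrier component. The sample rate, frequencies, and function name below are illustrative assumptions, not from the source:

```python
import math

def dsbsc_modulate(message, fc, fs):
    """DSBSC: multiply the message samples by a carrier cos(2*pi*fc*t).

    Unlike standard AM, no carrier term is added, so the output spectrum
    contains only the two sidebands (fc - fm and fc + fm), not the carrier.
    """
    return [m * math.cos(2 * math.pi * fc * n / fs) for n, m in enumerate(message)]

# Illustrative numbers: a 100 Hz tone sampled at 8 kHz on a 1 kHz carrier,
# so the sidebands sit at 900 Hz and 1100 Hz.
fs, fc, fm = 8000, 1000, 100
message = [math.cos(2 * math.pi * fm * n / fs) for n in range(fs)]
dsbsc = dsbsc_modulate(message, fc, fs)
```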

Analog Communication - DSBSC Demodulators

The process of extracting the original message signal from a DSBSC wave is known as detection or demodulation of DSBSC. The following demodulators (detectors) are used for demodulating a DSBSC wave.
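One standard detector is coherent (synchronous) detection: multiply the received wave by a locally generated carrier in phase with the transmitter's, then low-pass filter the result. The sketch below builds its own test signal and uses a crude moving-average filter purely for illustration; a practical receiver would use a proper filter design and carrier recovery:

```python
import math

def coherent_detect(dsbsc, fc, fs, lpf_len=4):
    """Coherent (synchronous) detection: multiply by a local carrier in phase
    with the transmitter's, then low-pass filter.  The product contains
    m(t)/2 plus a component at 2*fc, which the filter removes."""
    mixed = [s * math.cos(2 * math.pi * fc * n / fs) for n, s in enumerate(dsbsc)]
    out = []
    for n in range(len(mixed)):
        window = mixed[max(0, n - lpf_len + 1): n + 1]   # crude moving-average LPF
        out.append(2 * sum(window) / len(window))        # factor 2 restores amplitude
    return out

# Build a DSBSC wave inline: 100 Hz tone on a 1 kHz carrier at 8 kHz sampling.
fs, fc, fm = 8000, 1000, 100
message = [math.cos(2 * math.pi * fm * n / fs) for n in range(2000)]
dsbsc = [m * math.cos(2 * math.pi * fc * n / fs) for n, m in enumerate(message)]
recovered = coherent_detect(dsbsc, fc, fs)
```

The default `lpf_len=4` is chosen so the moving average spans exactly one period of the unwanted 2·fc (2 kHz) component at this sample rate, cancelling it while passing the slow 100 Hz message.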

Decision Tree Induction

Decision tree induction is the learning of decision trees from class-labelled training examples. A decision tree is a flowchart-like tree structure, where each internal node denotes a test on an attribute, each branch represents an outcome of the test, and each leaf node holds a class label. ID3, C4.5, and CART adopt a greedy (non-backtracking) approach in which decision trees are constructed in a top-down, recursive, divide-and-conquer manner.
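At each internal node, the attribute chosen is the one that best separates the classes; ID3 measures this with information gain (entropy reduction). A minimal sketch of that selection step, with toy data and attribute names invented for illustration:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Information gain of splitting (rows, labels) on attribute `attr`.
    Each row is a dict of attribute -> value (toy representation)."""
    n = len(labels)
    by_value = {}
    for row, y in zip(rows, labels):
        by_value.setdefault(row[attr], []).append(y)
    remainder = sum(len(part) / n * entropy(part) for part in by_value.values())
    return entropy(labels) - remainder

# Invented toy data: "outlook" perfectly predicts the class, "windy" does not,
# so a greedy ID3-style learner would test "outlook" at the root.
rows = [
    {"outlook": "sunny", "windy": "no"},
    {"outlook": "sunny", "windy": "yes"},
    {"outlook": "rain",  "windy": "no"},
    {"outlook": "rain",  "windy": "yes"},
]
labels = ["no", "no", "yes", "yes"]
best = max(["outlook", "windy"], key=lambda a: info_gain(rows, labels, a))
```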

Genetic Algorithms

A genetic algorithm is a non-traditional optimization technique. It mimics the process of evolution and is hence called an evolutionary technique: a search algorithm based on the mechanics of natural selection and natural genetics, built on the "survival of the fittest" concept of Darwinian theory (e.g. the average life expectancy of an Indian is 70–80 years). It simulates the process of evolution, which is itself an optimization process.
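A minimal sketch of the mechanics, assuming a toy problem of maximizing f(x) = x² over 5-bit integers; the operators (binary tournament selection, single-point crossover, bit-flip mutation) and all parameters are illustrative choices, not from the source:

```python
import random

def genetic_max(fitness, n_bits=5, pop_size=20, generations=40, p_mut=0.02, seed=0):
    """Toy genetic algorithm maximizing `fitness` over n_bit integers.
    Sketch only: tournament selection, single-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    decode = lambda bits: int("".join(map(str, bits)), 2)

    def select():
        a, b = rng.sample(pop, 2)           # binary tournament: fitter of two survives
        return a if fitness(decode(a)) >= fitness(decode(b)) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)  # single-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(decode(ind) for ind in pop)

best = genetic_max(lambda x: x * x)  # evolve toward the maximum of x^2 on 0..31
```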

Support Vector Machines

The Support Vector Machine (SVM) was introduced by Boser, Guyon, and Vapnik at COLT-92 (Pittsburgh) in 1992. SVMs are a set of related supervised learning methods used for classification and regression: prediction tools that use machine learning theory to maximize predictive accuracy while automatically avoiding over-fitting to the data. They belong to a family of generalized linear classifiers, and became popular after the technique gave accuracy comparable to sophisticated neural networks with elaborate features in a handwriting recognition task. SVMs are used in many applications, such as handwriting analysis and face analysis, especially pattern-classification and regression-based applications.
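As an illustration of the underlying idea (maximizing the margin via a hinge loss), here is a toy linear soft-margin SVM trained by sub-gradient descent. Real applications would use an optimized library such as LIBSVM or scikit-learn; the data and hyper-parameters below are invented:

```python
def svm_train(xs, ys, lam=0.01, lr=0.1, epochs=200):
    """Linear soft-margin SVM via sub-gradient descent on the regularized
    hinge loss: lam*||w||^2 + mean(max(0, 1 - y*(w.x + b))).
    Labels must be +1/-1.  A sketch, not a production implementation."""
    d = len(xs[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:   # point inside the margin: hinge term is active
                w = [wi + lr * (y * xi - 2 * lam * wi) for wi, xi in zip(w, x)]
                b += lr * y
            else:            # correctly classified with margin: only regularize
                w = [wi - lr * 2 * lam * wi for wi in w]
    return w, b

def svm_predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Invented, linearly separable toy data.
xs = [(2, 2), (3, 3), (-2, -1), (-3, -2)]
ys = [1, 1, -1, -1]
w, b = svm_train(xs, ys)
```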

Regression Models in Machine Learning

A regression model estimates the nature of the relationship between the independent and dependent variables: the change in the dependent variable that results from changes in the independent variables (i.e. the size of the relationship), the strength of the relationship, and the statistical significance of the relationship.
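For the simplest case, ordinary least squares on one independent variable gives the size of the relationship (the slope) and, via the correlation coefficient, its strength. The data points below are invented for illustration:

```python
import math

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x.  The slope b gives the size
    (and sign) of the relationship; a is the fitted intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b

def pearson_r(xs, ys):
    """Pearson correlation coefficient: the *strength* of the linear relationship."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Noiseless points on y = 1 + 2x, so the fit is exact and r = 1.
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
r = pearson_r([0, 1, 2, 3], [1, 3, 5, 7])
```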

Machine Learning

This is a question paper for 6th-semester students of Nirma University; you can get the references from here.

Data Mining and basics of Artificial Intelligence

This is a 2017 question paper of Nirma University for Data Mining.

Introduction to Arbitration

Arbitration is emerging as the first-choice method of binding dispute resolution in the widest range of international commercial contracts. It is a private process requiring the agreement of the parties, which is usually given by way of an arbitration clause in the contract. If there is no contractual provision to arbitrate, a separate arbitration agreement may be entered into once a dispute has arisen.

Overview of Database Design--the ER Model

The initial phase of database design is to characterize fully the data needs of the prospective database users. Next, the designer chooses a data model and, by applying the concepts of the chosen data model, translates these requirements into a conceptual schema of the database. A fully developed conceptual schema also indicates the functional requirements of the enterprise. In a “specification of functional requirements”, users describe the kinds of operations (or transactions) that will be performed on the data. The ER data model was developed to facilitate database design by allowing specification of an enterprise schema that represents the overall logical structure of a database.
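One way to see the step from conceptual schema to relations: each entity set becomes a relation, and a many-to-many relationship set becomes a relation whose key is the union of the participating entity sets' primary keys. A toy sketch, with entity names and attributes invented for illustration:

```python
# Entity sets from a toy ER diagram, represented as simple schema records:
# relation name, attribute list, and primary key.
student = {"name": "student", "attrs": ["student_id", "name"],  "key": ["student_id"]}
course  = {"name": "course",  "attrs": ["course_id", "title"],  "key": ["course_id"]}

def relationship_schema(name, *entities):
    """Map a many-to-many ER relationship set to a relation whose key is the
    union of the participating entity sets' primary keys."""
    key = [k for e in entities for k in e["key"]]
    return {"name": name, "attrs": key[:], "key": key}

# The relationship set "enrolled" between student and course becomes
# enrolled(student_id, course_id).
enrolled = relationship_schema("enrolled", student, course)
```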

Query Processing

Query processing is the translation of high-level queries into low-level expressions. It is a stepwise process used at the physical level of the file system, in query optimization, and in the actual execution of the query to get the result. It requires the basic concepts of relational algebra and file structures.
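For instance, a high-level SQL query corresponds to a relational-algebra expression built from operators such as selection (σ) and projection (π), which the lower levels then evaluate over stored records. A toy sketch with invented data:

```python
def select(rows, pred):
    """Relational-algebra selection (sigma): keep rows satisfying pred."""
    return [r for r in rows if pred(r)]

def project(rows, attrs):
    """Relational-algebra projection (pi): keep listed attributes, drop duplicates."""
    seen, out = set(), []
    for r in rows:
        t = tuple((a, r[a]) for a in attrs)
        if t not in seen:
            seen.add(t)
            out.append(dict(t))
    return out

# SELECT name FROM emp WHERE dept = 'CS'   =>   pi_name(sigma_dept='CS'(emp))
emp = [{"name": "Ann", "dept": "CS"}, {"name": "Raj", "dept": "EE"},
       {"name": "Lee", "dept": "CS"}]
result = project(select(emp, lambda r: r["dept"] == "CS"), ["name"])
```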