Saransh Agarwal
Think Big
Skills Acquired at Skill-Lync:
Introduction
8 Projects
Basics of Probability and Statistics Week 1 Challenge
Ans 1 Population variance is the variance calculated from population data, while sample variance is the variance calculated from sample data. Because of this, the denominator in the variance formula is 'n-1' for sample data and 'n' for population data. Ans 2 In…
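A minimal NumPy sketch of that difference, using a small made-up sample (the data is illustrative, not from the challenge): `ddof=0` uses the population denominator n, `ddof=1` uses the sample denominator n-1.

```python
import numpy as np

# Hypothetical data used only to illustrate the two denominators.
x = np.array([4.0, 7.0, 9.0, 10.0, 15.0])
n = len(x)
mean = x.mean()

population_var = np.var(x, ddof=0)   # divides by n
sample_var = np.var(x, ddof=1)       # divides by n - 1 (Bessel's correction)

# Same results computed by hand from the definitions.
assert np.isclose(population_var, ((x - mean) ** 2).sum() / n)
assert np.isclose(sample_var, ((x - mean) ** 2).sum() / (n - 1))

print(population_var, sample_var)
```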
02 May 2022 06:22 AM IST
Supervised Learning - Prediction Week 3 Challenge
Ans 1 is in the attachment. Ans 2 L1 tends to shrink some coefficients to zero, whereas L2 tends to shrink coefficients evenly. L1 is therefore useful for feature selection, since we can drop any variables whose coefficients go to zero. L2, on the other hand, is useful when you have collinear/codependent features.…
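A hedged scikit-learn sketch of that L1-vs-L2 behaviour on synthetic data (the dataset and alpha values are assumptions for illustration): Lasso (L1) drives some coefficients exactly to zero, while Ridge (L2) only shrinks them.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge

# Synthetic regression problem: 10 features, only 3 truly informative.
X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=5.0, random_state=0)

lasso = Lasso(alpha=1.0).fit(X, y)   # L1 penalty
ridge = Ridge(alpha=1.0).fit(X, y)   # L2 penalty

print("L1 coefficients set to zero:", np.sum(lasso.coef_ == 0))   # typically > 0
print("L2 coefficients set to zero:", np.sum(ridge.coef_ == 0))   # typically 0
```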
03 May 2022 07:36 AM IST
Basics of ML & AI Week 2 Challenge
Week 2 Challenge Ans 1 First business moment = 1.4, Second business moment = 2.326, Third business moment = 0.4086, Fourth business moment = -30.7919. Ans 2 To calculate the mean, we sum up all the values and divide the sum by the count of values. This can also be put as, we…
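A short sketch of how the four business moments can be computed in Python; the sample below is made up, so its values will not match the numbers quoted above.

```python
import numpy as np
from scipy import stats

# Hypothetical sample; the original challenge dataset is not reproduced here.
x = np.array([2, 3, 5, 7, 11, 13, 17])

first_moment = np.mean(x)            # mean (central tendency)
second_moment = np.var(x, ddof=1)    # sample variance (dispersion)
third_moment = stats.skew(x)         # skewness (asymmetry)
fourth_moment = stats.kurtosis(x)    # excess kurtosis (tailedness)

print(first_moment, second_moment, third_moment, fourth_moment)
```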
13 May 2022 06:41 PM IST
Supervised Learning - Classification Week 9 Challenge
Ans 1 A neural network is a series of algorithms that endeavors to recognize underlying relationships in a set of data through a process that mimics the way the human brain operates. In this sense, neural networks refer to systems of neurons, either organic or artificial in nature. Neural networks can adapt to changing…
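As a hedged illustration, the sketch below fits a small multi-layer perceptron with scikit-learn; the dataset and layer size are arbitrary choices, not part of the original answer.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic binary-classification data for illustration only.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One hidden layer of 32 neurons; the weights adapt iteratively during training.
mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
mlp.fit(X_train, y_train)

print("test accuracy:", mlp.score(X_test, y_test))
```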
21 May 2022 12:43 PM IST
Supervised Learning - Classification Week 7 Challenge
Ans 1 Pros: It works really well with a clear margin of separation. It is effective in high-dimensional spaces. It is effective in cases where the number of dimensions is greater than the number of samples. It uses a subset of training points in the decision function (called support vectors), so it is also memory efficient.…
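A minimal scikit-learn SVM sketch illustrating those points; the synthetic data (more features than samples) and the RBF kernel are assumptions made for the example.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic high-dimensional data: 50 features for only 100 samples.
X, y = make_classification(n_samples=100, n_features=50, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)

# The decision function depends only on the support vectors, which is why
# the model stays memory efficient even in high-dimensional spaces.
print("support vectors used:", clf.support_vectors_.shape[0])
print("test accuracy:", clf.score(X_test, y_test))
```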
21 May 2022 01:08 PM IST
Supervised Learning - Classification Week 8 Challenge
Ans 1 is in the attachment. Ans 2 Pros: K-NN is pretty intuitive and simple: the K-NN algorithm is very simple to understand and equally easy to implement. To classify a new data point, the K-NN algorithm reads through the whole dataset to find the K nearest neighbors. K-NN has no assumptions: K-NN is a non-parametric algorithm which…
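A short scikit-learn sketch of that K-NN behaviour; the value of K and the dataset are illustrative assumptions. Because K-NN is non-parametric and lazy, fit() mostly just stores the training set, and each prediction searches it for the K nearest neighbours.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# fit() essentially memorises the training data; the work happens at predict time.
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)

print("test accuracy:", knn.score(X_test, y_test))
```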
21 May 2022 01:08 PM IST
Unsupervised Learning - Kmeans Week 11 Challenge
Ans 1 The simplest way to find similarity between two categorical attributes is to assign a similarity of 1 if the values are identical and a similarity of 0 if the values are not identical.
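A small Python sketch of this simple-matching idea (the records and attribute names are made up): each identical attribute value contributes 1, each mismatch contributes 0, and the overall similarity is the average across attributes.

```python
def simple_matching_similarity(a: dict, b: dict) -> float:
    """Average per-attribute match: 1 if values are identical, else 0."""
    shared_keys = a.keys() & b.keys()
    matches = sum(1 for k in shared_keys if a[k] == b[k])
    return matches / len(shared_keys)

# Hypothetical categorical records.
customer_1 = {"city": "Delhi", "segment": "retail", "plan": "gold"}
customer_2 = {"city": "Delhi", "segment": "corporate", "plan": "gold"}

print(simple_matching_similarity(customer_1, customer_2))  # 2 of 3 match -> 0.67
```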
26 May 2022 08:01 AM IST
Project 1
17 Jun 2022 01:38 PM IST
4 Course Certificates
Academic Qualification
BBA
Lovely Professional University
24 Jul 2019 - 29 Apr 2022
12th
Shivpuri Public School
18 Apr 2018 - 21 Mar 2019
10th
Shivpuri Public School
21 Apr 2016 - 15 Mar 2017
Here are the courses that I have enrolled in
40 Hours of Content