submitted
Yogeshwar Manerikar
updated on 02 Mar 2022
Project Details
Read more Projects by Yogeshwar Manerikar (13)
Project 2
done
08 Apr 2022 02:49 PM IST
Project 1
I cleaned the data, drew some insights, used mean and mode imputation to remove nulls, and applied one-hot encoding to horsepower_binned; a minimal sketch of these steps is shown below.
07 Apr 2022 07:14 PM IST
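A minimal sketch of the mean/mode imputation and one-hot encoding described above, assuming a pandas DataFrame; the file name and the horsepower and fuel_type columns are hypothetical, only horsepower_binned comes from the entry:

import pandas as pd

# Hypothetical input file and column names.
df = pd.read_csv("cars.csv")

# Numeric nulls filled with the mean, categorical nulls with the mode.
df["horsepower"] = df["horsepower"].fillna(df["horsepower"].mean())
df["fuel_type"] = df["fuel_type"].fillna(df["fuel_type"].mode()[0])

# One-hot encode the binned horsepower column.
df = pd.get_dummies(df, columns=["horsepower_binned"], drop_first=True)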
Supervised Learning - Classification Week 9 Challenge
Q1. The neuron is the smallest unit and the building block of a neural network. A neuron takes a set of inputs, performs some mathematical computations, and gives an output. The inputs and outputs are numbers, either positive or negative. In this example, the neuron takes two inputs. However, there is no limit to the number…
07 Apr 2022 02:55 PM IST
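A minimal sketch of the neuron described above: a weighted sum of two inputs plus a bias, passed through a sigmoid activation. The weights, bias, and choice of activation are assumptions for illustration:

import numpy as np

def neuron(x, w, b):
    # Weighted sum of the inputs plus a bias term.
    z = np.dot(w, x) + b
    # Sigmoid activation squashes the result into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Two inputs, as in the example; the values are arbitrary.
output = neuron(x=np.array([0.5, -1.2]), w=np.array([0.8, 0.3]), b=0.1)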
Unsupervised Learning - Kmeans Week 11 Challenge
Q1. Computing similarity between categorical data instances is not straightforward because there is no explicit notion of ordering between categorical values. To overcome this problem, several data-driven similarity measures have been proposed for categorical data. The behavior of such measures directly depends…
07 Apr 2022 01:41 PM IST
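As a concrete baseline for the point above, here is a minimal sketch of the simple overlap (matching) similarity for categorical records; the data-driven measures the entry refers to refine this idea using attribute-value frequencies:

def overlap_similarity(a, b):
    # Fraction of attributes on which the two records take the same value.
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

# Toy categorical records (values are illustrative).
sim = overlap_similarity(["red", "suv", "manual"], ["red", "sedan", "manual"])  # 2/3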
Supervised Learning - Classification Week 8 Challenge
On high-dimensional data, KNN may not perform as well as other techniques; it can benefit from feature selection that reduces the dimensionality of the input feature space. This is the main point to note. Pros: As there is no training period, new data can be added at any time without affecting the model. Cons: Does…
06 Apr 2022 09:39 AM IST
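A minimal sketch of pairing KNN with feature selection so that distances are computed in a lower-dimensional space; the dataset, k=10 selected features, and n_neighbors=5 are illustrative assumptions:

from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

X, y = load_breast_cancer(return_X_y=True)

# Keep the 10 most informative features before computing neighbour distances.
model = make_pipeline(SelectKBest(f_classif, k=10),
                      KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(model, X, y, cv=5)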
Supervised Learning - Classification Week 7 Challenge
Advantages: SVM works relatively well when there is a clear margin of separation between classes. It is more effective in high-dimensional spaces because of the kernel trick, and it is effective in cases where the number of dimensions is greater than the number of samples; this is used in computer vision, where dimensionality reduction is applied. SVM…
04 Apr 2022 01:41 PM IST
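A minimal sketch of a kernel SVM on a moderately high-dimensional dataset; the digits data, the RBF kernel, and C=1.0 are illustrative assumptions, not part of the original answer:

from sklearn.datasets import load_digits
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)  # 64 features per sample
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The RBF kernel lets the margin be non-linear in the original feature space.
clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))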
Supervised Learning - Prediction Week 3 Challenge
Q1. In the IPython file. Q2. The key difference between these techniques is that L1 shrinks the less important features' coefficients to zero, thus removing some features altogether. So this works well for feature selection when we have a huge number of features. L1 regularization penalizes the sum of the absolute values of the weights,…
23 Mar 2022 07:25 AM IST
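A minimal sketch contrasting L1 (Lasso) and L2 (Ridge) regularization on synthetic data, showing that L1 drives some coefficients exactly to zero while L2 only shrinks them; the alpha values and data shape are illustrative:

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge

X, y = make_regression(n_samples=200, n_features=20, n_informative=5,
                       noise=10.0, random_state=0)

lasso = Lasso(alpha=1.0).fit(X, y)   # penalizes sum of absolute weights
ridge = Ridge(alpha=1.0).fit(X, y)   # penalizes sum of squared weights

zeroed_by_l1 = int(np.sum(lasso.coef_ == 0))  # features removed altogether by L1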
Basics of ML & AL Week 2 Challenge
Q1. 1st Business Moment: Measure of Central Tendency. This moment describes the center of the data and indicates where the majority of data points lie. a.) Mean: The average of all the data points in a data set is called the mean. For a population, μ = (Σ Xi) / N; here it does not have the same…
21 Mar 2022 05:47 AM IST
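A short worked sketch of the first business moment on a toy data set; the numbers are illustrative:

import statistics

data = [4, 8, 6, 5, 3, 8]
mean = statistics.mean(data)      # (4 + 8 + 6 + 5 + 3 + 8) / 6 = 5.67
median = statistics.median(data)  # middle of the sorted values = 5.5
mode = statistics.mode(data)      # most frequent value = 8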
Basics of Probability and Statistics Week 1 Challenge
Q1. Sample Variance vs. Population Variance. The formula for sample variance is given as Σ (xi − μ)² / (n − 1), and the formula for population variance is Σ (xi − μ)² / n, where xi = ith number,…
18 Mar 2022 06:31 PM IST
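A minimal sketch of the two variance formulas above on a toy sample; ddof=1 divides by n − 1 (sample variance) and ddof=0 divides by n (population variance):

import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
sample_var = x.var(ddof=1)      # sum((x - x.mean())**2) / (len(x) - 1) = 4.57
population_var = x.var(ddof=0)  # sum((x - x.mean())**2) / len(x) = 4.0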
Project 2
Q1, Q2, Q3, Q4, Q5, Q6, Q7, Q8, Q9, Q10, Q11, Q12, Q13, Q14, Q15, Q16
17 Mar 2022 05:03 PM IST
Project 1
submitted
07 Mar 2022 07:13 AM IST
Project 1 - English Dictionary App & Library Book Management System
submitted
02 Mar 2022 03:34 PM IST
Project 2 - EDA on Vehicle Insurance Customer Data
submitted
26 Feb 2022 04:23 PM IST