
Project 2

In linear regression, the model aims to find the best-fit regression line to predict the value of y from a given input value (x). While training, the model evaluates a cost function that measures the Root Mean Squared Error between the predicted value (pred) and the true value (y). The model aims to…
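
A minimal sketch of that idea, assuming nothing about the project's actual data or code: fit a line pred = w·x + b by gradient descent on synthetic numbers and report the RMSE cost (all names here are placeholders).

import numpy as np

# synthetic data: y is roughly 3x + 4 plus noise (placeholder dataset)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 3 * x + 4 + rng.normal(0, 1, 100)

w, b, lr = 0.0, 0.0, 0.01
for _ in range(2000):
    pred = w * x + b
    error = pred - y
    # gradients of the mean squared error with respect to w and b
    w -= lr * 2 * np.mean(error * x)
    b -= lr * 2 * np.mean(error)

rmse = np.sqrt(np.mean((w * x + b - y) ** 2))
print(f"w={w:.2f}, b={b:.2f}, RMSE={rmse:.2f}")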


Read more Projects by Saurabh Ranjan (22)

Project 1

Objective:

In machine learning, when we want to train an ML model, we split the entire dataset into a training_set and a test_set using the train_test_split() function from sklearn. We then train the model on the training_set and test it on the test_set. The problems that we are going to face with this method are: Whenever we change the…
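
A minimal sketch of the single train/test split being described, assuming a generic dataset (sklearn's built-in iris data and a logistic regression are used here purely as stand-ins):

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)

# one train/test split; the reported accuracy can shift if the split changes
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(accuracy_score(y_test, model.predict(X_test)))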

29 Apr 2022 02:38 PM IST • FEA

Project 2

Objective:

Back Propagation (Gradient computation)

The backpropagation learning algorithm can be divided into two phases: propagation and weight update.

Phase 1: Propagation. Each propagation involves the following steps: forward propagation of a training pattern's input through the neural network in order to generate…
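
As a loose illustration of those two phases (not the project's own network), here is a one-hidden-layer sketch in NumPy with made-up shapes: a forward pass, a backward pass for the gradients, and then the weight update.

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 3))          # one training pattern with 3 inputs
t = np.array([[1.0]])                # its target
W1 = rng.normal(size=(3, 4)) * 0.1   # input -> hidden weights
W2 = rng.normal(size=(4, 1)) * 0.1   # hidden -> output weights
lr = 0.1

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Phase 1: propagation (forward pass, then backward pass for the gradients)
h = sigmoid(x @ W1)                           # hidden activations
y = sigmoid(h @ W2)                           # network output
delta_out = (y - t) * y * (1 - y)             # output-layer error signal
delta_hid = (delta_out @ W2.T) * h * (1 - h)  # hidden-layer error signal
grad_W2 = h.T @ delta_out
grad_W1 = x.T @ delta_hid

# Phase 2: weight update (one gradient descent step)
W2 -= lr * grad_W2
W1 -= lr * grad_W1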

29 Apr 2022 02:01 PM IST

Project 10

Objective:

import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
import plotly.express as px
import nltk
import warnings

warnings.simplefilter(action='ignore')

# load the SMS spam dataset, keeping only the label (v1) and message (v2) columns
df = pd.read_csv('../input/sms-spam-collection-dataset/spam.csv', usecols=['v1', 'v2'], encoding='cp1252')
df.head(2)
df.shape
df.info()…

27 Apr 2022 04:10 PM IST

Project 11

Objective:

import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split

df = pd.read_csv("/kaggle/input/creditcardfraud/creditcard.csv")
df.head()

# separate the predictors from the fraud label
features = df.drop("Class", axis=1)
labels = df.loc[:, ['Class']]
labels.head()

X_train, X_test, y_train, y_test = train_test_split(features,…

27 Apr 2022 11:43 AM IST

Project 9

Objective:

The vanishing gradient problem was discovered by Sepp Hochreiter, a German computer scientist who has had an influential role in the development of recurrent neural networks in deep learning. The vanishing gradient problem arises in gradient-descent-based training of deep networks. Recall that a gradient descent algorithm…
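
As a quick numerical illustration (not taken from the project), the chain rule multiplies one sigmoid derivative per layer, and since that derivative is at most 0.25 the product collapses as depth grows, which is the vanishing gradient in miniature:

import numpy as np

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

z = 0.5
grad = 1.0
for layer in range(20):
    s = sigmoid(z)
    grad *= s * (1 - s)   # each factor is <= 0.25
print(grad)               # roughly 1e-13 after 20 layers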

27 Apr 2022 11:27 AM IST

Project 8

Objective:

import os
from collections import defaultdict

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
from PIL import Image
from tqdm import tqdm_notebook as tqdm

root = '../input/vehicle/train/train/'
data = []
for category in sorted(os.listdir(root)):
    for file in sorted(os.listdir(os.path.join(root,…

27 Apr 2022 10:03 AM IST

Project 7

Objective:

Padding: Problem with Simple Convolution Layers

For a grayscale (n x n) image and an (f x f) filter/kernel, the dimensions of the image resulting from a convolution operation are (n − f + 1) x (n − f + 1). For example, for an (8 x 8) image and a (3 x 3) filter, the output resulting after the convolution operation…
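
To make that dimension formula concrete (a quick check, not part of the project's code), a "valid" 2-D convolution of an 8 x 8 array with a 3 x 3 kernel yields a 6 x 6 output, while zero-padding ("same") keeps the original size:

import numpy as np
from scipy.signal import convolve2d

image = np.random.rand(8, 8)
kernel = np.ones((3, 3))

valid = convolve2d(image, kernel, mode='valid')  # no padding: (n - f + 1) per side
same = convolve2d(image, kernel, mode='same')    # zero-padded to match the input
print(valid.shape, same.shape)                   # (6, 6) (8, 8)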

27 Apr 2022 09:49 AM IST

Project 6

Objective:

import numpy as np
import pandas as pd
import tensorflow as tf
import matplotlib.pyplot as plt
import missingno as msno
import seaborn as sns

df = pd.read_csv('machine_failure_data.csv')
msno.matrix(df)   # visualise the missing values in each column

# compute the percentage of missing values per column
values_list = list()
cols_list = list()
for col in df.columns:
    pct_missing = np.mean(df[col].isnull()) * 100
    cols_list.append(col)…

27 Apr 2022 09:16 AM IST

Project 3

Objective:

2. Gradient Descent

The job of the gradient descent algorithm is to update the weights in such a way that the loss function is minimized. The update rule is w := w − η · ∂L/∂w, i.e. each weight takes a small step against its gradient. Now, from the image above, we can calculate the values of Z1, Z2, etc., and all the derivatives, as in the images below. Activation functions are used…
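
Returning to the update rule mentioned above, a minimal sketch of it on a single weight (the quadratic loss here is invented purely for illustration):

# gradient descent on a toy loss L(w) = (w - 3)^2, whose gradient is 2 * (w - 3)
w = 0.0
lr = 0.1
for step in range(50):
    grad = 2 * (w - 3)
    w = w - lr * grad   # the update rule: step against the gradient
print(w)                # approaches 3, the minimiser of L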

27 Apr 2022 06:45 AM IST

Project 1

Objective:

2. b) Mini-Batch Gradient Descent: Algorithm

Let theta = model parameters and max_iters = number of epochs.

for itr = 1, 2, 3, …, max_iters:
    for each mini_batch (X_mini, y_mini):
        Forward pass on the batch X_mini:
            Make predictions on the mini-batch
            Compute error in predictions (J(theta)) with…
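
Below is one way that loop might look in NumPy for a simple linear model; the data, learning rate, and batch size are arbitrary stand-ins, not the project's actual setup.

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -1.0]) + 0.5 + rng.normal(0, 0.1, 200)

theta = np.zeros(2)
bias = 0.0
lr, batch_size, max_iters = 0.05, 32, 100

for itr in range(max_iters):
    order = rng.permutation(len(X))                # shuffle once per epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        X_mini, y_mini = X[idx], y[idx]
        pred = X_mini @ theta + bias               # forward pass on the mini-batch
        err = pred - y_mini                        # error in the predictions
        theta -= lr * (X_mini.T @ err) / len(idx)  # gradient step for the weights
        bias -= lr * err.mean()                    # gradient step for the bias

print(theta, bias)   # close to [2, -1] and 0.5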

26 Apr 2022 01:36 PM IST

Project 5

Objective:

Boosting is an ensemble modelling technique that attempts to build a strong classifier from a number of weak classifiers. It does this by building weak models in series: first, a model is built from the training data; then a second model is built which tries to correct the errors present in the…
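
For instance (a generic sketch, not the project's code), scikit-learn's AdaBoostClassifier builds exactly such a sequence of weak learners on a toy dataset:

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 50 weak learners (shallow decision stumps by default) fitted in series,
# each one reweighting the examples the previous ones got wrong
clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))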

26 Apr 2022 01:25 PM IST

Project 4

Objective:

Let us consider a dataset where we have a value of the response y for every feature x. For generality, we define:

x as the feature vector, i.e. x = [x_1, x_2, …, x_n]
y as the response vector, i.e. y = [y_1, y_2, …, y_n]

for n observations (in the above example, n = 10). A scatter plot of the above dataset…
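
Continuing that setup with invented numbers (the project's own data isn't shown here), the simple-linear-regression coefficients for y ≈ b0 + b1·x can be computed directly from the two vectors:

import numpy as np

# a small illustrative dataset of n = 10 (x, y) pairs
x = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=float)
y = np.array([1, 3, 2, 5, 7, 8, 8, 9, 10, 12], dtype=float)

# ordinary least squares estimates of the slope and intercept
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
print(b0, b1)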

26 Apr 2022 11:31 AM IST

Project 2

Objective:

In linear regression, the model aims to find the best-fit regression line to predict the value of y from a given input value (x). While training, the model evaluates a cost function that measures the Root Mean Squared Error between the predicted value (pred) and the true value (y). The model aims to…

26 Apr 2022 08:51 AM IST

Project 1

Objective:

A) Importing the necessary libraries and the raw data

import pandas as pd
import numpy as np
import matplotlib.pyplot as plt

df = pd.read_csv('auto.csv')
df.head()
print(df.dtypes)

B) Basic insight from the data

print(df.dtypes)

In the above picture, "make" is the brand name of the car, so it is supposed to be an object. But bore…

25 Apr 2022 08:35 AM IST

Project 1

Objective:

import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
import numpy as np

data1 = pd.read_csv('Stock_file_1.csv')
data2 = pd.read_csv('stock_file_2.txt')

# count how many rows each date has in the two files
unique1, counts1 = np.unique(data1['date'], return_counts=True)
unique2, counts2 = np.unique(data2['date'], return_counts=True)

# stack the two files on their common columns
data_merged = pd.concat([data1, data2], axis=0, join='inner', ignore_index=True)…

20 Apr 2022 02:39 PM IST

Unsupervised Learning - Kmeans Week 11 Challenge

Objective:

1. We can classify measures in several ways, based on: (i) the manner in which they fill the entries of the similarity matrix, (ii) whether the weight given is a function of the frequency of the attribute values, and (iii) the arguments used to propose the measure (probabilistic, information-theoretic, etc.). We will describe…

14 Apr 2022 07:05 AM IST

Supervised Learning - Classification Week 8 Challenge

Objective:

1.

import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score
from sklearn.metrics import plot_confusion_matrix

# Dataframe df
df = pd.read_csv('fault.csv')…

11 Apr 2022 12:56 PM IST

Supervised Learning - Classification Week 7 Challenge

Objective:

1. Advantages of support vector machines: A support vector machine works well when there is a clear margin of separation between classes. It is effective in high-dimensional spaces, and remains effective when the number of dimensions is greater than the number of samples. Support…

11 Apr 2022 11:48 AM IST

Supervised Learning - Classification Week 9 Challenge

Objective:

1. Neural networks, also known as artificial neural networks (ANNs) or simulated neural networks (SNNs), are a subset of Machine Learning and are at the heart of deep learning algorithms. Their name and structure are inspired by the human brain, mimicking the way that biological neurons signal to one another. Artificial…

09 Apr 2022 01:41 PM IST

Basics of Probability and Statistics Week 1 Challenge

Objective:

1. The main difference between population variance and sample variance relates to how the variance is calculated. Variance is calculated in five steps: first the mean is calculated, then we calculate the deviations from the mean, thirdly the deviations are squared, fourthly the squared deviations are summed up, and finally this…
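
A quick numeric check of that distinction (the numbers are illustrative only): the population variance divides the summed squared deviations by n, while the sample variance divides by n − 1, which NumPy exposes through the ddof argument.

import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])

population_var = np.var(x)         # divide by n
sample_var = np.var(x, ddof=1)     # divide by n - 1 (Bessel's correction)
print(population_var, sample_var)  # 4.0 and about 4.57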

09 Apr 2022 10:43 AM IST

Supervised Learning - Prediction Week 3 Challenge

Objective:

1. Gradient descent is an optimization technique that can find the minimum of an objective function. It is a greedy technique that finds the optimal solution by taking a step in the direction of the maximum rate of decrease of the function.

import numpy as np
import matplotlib.pyplot as plt

def mean_squared_error(y_true,…

09 Apr 2022 07:34 AM IST

Basics of ML & AL Week 2 Challenge

Objective:

1.

2. If the probabilities of occurrence of the events are all equal, then we can compute a simple mean. But if the probabilities are not equal, or an event may occur more than once, then the expected value (a probability-weighted mean) is used in place of the simple mean, as the example below shows.

3. In real life, distributions are commonly skewed.…
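
As a small illustration of point 2 (the numbers are invented), the expected value weights each outcome by its probability and only coincides with the simple mean when the probabilities are equal:

outcomes = [1, 2, 3, 4]
probs = [0.1, 0.2, 0.3, 0.4]   # unequal probabilities

simple_mean = sum(outcomes) / len(outcomes)                   # 2.5
expected_value = sum(x * p for x, p in zip(outcomes, probs))  # 3.0
print(simple_mean, expected_value)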

08 Apr 2022 07:21 AM IST