CS 615 - Deep Learning

Assignment 1 - Forward Propagation, Spring 2024

Introduction

In this first assignment we’ll begin designing and implementing the modules we’ll use in our deep learning architectures. In addition, we’ll start getting used to importing datasets.

Allowable Libraries/Functions

Recall that you cannot use any ML functions to do the training or evaluation for you. Using basic statistical and linear algebra functions like mean, std, cov, etc. is fine, but using ones like train is not. Using any ML-related functions may result in a zero for the programming component. In general, use the “spirit of the assignment” (where we’re implementing things from scratch) as your guide, but if you want clarification on whether you can use a particular function, DM the professor on Discord.

Grading

Table 1: Grading Rubric

Theory: 20pts
Testing layers independently: 50pts
Test with dataset on connected layer set: 30pts
TOTAL: 100pts

Datasets

Kid Creative: We will use this dataset for binary classification. This dataset consists of data for 673 people in a CSV file. The data for each person includes:

1. Observation Number (we’ll want to omit this)
2. Buy (binary target value, Y)
3. Income
4. Is Female
5. Is Married
6. Has College
7. Is Professional
8. Is Retired
9. Unemployed
10. Residence Length
11. Dual Income
12. Minors
13. Own
14. House
15. White
16. English
17. Prev Child Mag
18. Prev Parent Mag

We’ll omit the first column and use the second column for our binary target Y. The remaining 16 columns provide our feature data for our observation matrix X.
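The column split described above can be sketched with NumPy as follows. The helper name and the commented-out filename are illustrative assumptions; adjust them to the actual CSV and loader you use.

```python
import numpy as np

def split_features_targets(data):
    # Drop column 0 (observation number), take column 1 (Buy) as the
    # N x 1 target matrix Y, and keep the remaining 16 columns as X.
    Y = data[:, 1].reshape(-1, 1)
    X = data[:, 2:]
    return X, Y

# Hypothetical filename/header handling; adjust to the actual file:
# data = np.genfromtxt("KidCreative.csv", delimiter=",", skip_header=1)
# X, Y = split_features_targets(data)
```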

1 Theory

1. Acknowledge each of the following by typing your name after each statement:

(a) While you can use online resources, you may not copy them. (1pt) Yixu Xia

(b) You cannot place any photos in your report. (1pt) Yixu Xia

(c) You cannot place any code snippets in your report. (1pt) Yixu Xia

(d) You cannot use any ML frameworks in your code, except for some purposes explicitly mentioned. (1pt) Yixu Xia

(e) While you may work in a Jupyter notebook or the like, you must export your code as a Python file for submission (and check that it works). (1pt) Yixu Xia

2. Given a single input observation x = [1 2 3] and a fully connected layer with weights

    W = [[1, 2],
         [3, 4],
         [5, 6]]

and biases b = [1 2], what is the output of the fully connected layer given x as its input (5pts)?

3. Given an input h = [10, -1], what would be the output(s) if this data were processed by the following activation functions/layers (10pts)?

(a) Linear
(b) ReLU
(c) Sigmoid
(d) Hyperbolic Tangent
(e) Softmax

1.1 Answers

2. y = xW + b = [22, 28] + [1, 2] = [23, 30]

3. (a) y = ax with a = 1, so y = [10, -1]

(b) y = max(0, h), y = [10, 0]

(c) y = 1/(1 + e^(-h)), y = [0.999, 0.269]

(d) y = tanh(h), y = [1, -0.762]

(e) y = [0.999, 1.67 × 10^(-5)]
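These hand computations can be double-checked numerically. A quick NumPy sketch of both theory questions (the variable names are just for illustration):

```python
import numpy as np

# Question 2: y = xW + b with the matrices from the theory section.
x = np.array([[1.0, 2.0, 3.0]])
W = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
b = np.array([[1.0, 2.0]])
y2 = x @ W + b                                  # -> [[23. 30.]]

# Question 3: each activation applied to h = [10, -1].
h = np.array([10.0, -1.0])
linear  = h                                      # a = 1
relu    = np.maximum(0, h)                       # [10, 0]
sigmoid = 1 / (1 + np.exp(-h))                   # ~[0.99995, 0.269]
tanh_h  = np.tanh(h)                             # ~[1, -0.762]
shift   = h - h.max()                            # stabilize the softmax
softmax = np.exp(shift) / np.exp(shift).sum()    # ~[0.99998, 1.67e-5]
```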

2 Start Implementing Your Layers

To organize our framework, we’ll keep each layer’s implementation in its own file and import them as needed in our main script.

You have been provided the abstract base class Layer in the file Layer.py. In addition, there is a lecture slide that shows how the importing of the modules is done. Your first task is to implement the following layers (each in its own file, with the filename being the same as the layer’s name).

We are going to have a script that imports each module and runs its forward method to see that it generates the proper output. We will do this with both a single observation and a batch. Both of these should be input as MATRICES (the single observation test is still a 1 × D matrix). We suggest that you do this testing yourself as well; we’re essentially doing unit testing.
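A unit check of that kind might look like the sketch below. The ReLULayer here is a minimal stand-in (your real class inherits from Layer); the point is the shape discipline — both single observations and batches go in as 2-D matrices.

```python
import numpy as np

class ReLULayer:
    """Minimal stand-in for the assignment's ReLU layer (illustration only)."""
    def forward(self, dataIn):
        return np.maximum(0, dataIn)

def check_forward(layer, X, expected):
    # Single observations and batches are both passed as 2-D matrices.
    out = layer.forward(X)
    assert out.shape == X.shape, "activation forward should preserve the NxK shape"
    assert np.allclose(out, expected)

# A 1 x D matrix for the single-observation test...
check_forward(ReLULayer(), np.array([[10.0, -1.0]]), np.array([[10.0, 0.0]]))
# ...and an N x D matrix for the batch test.
check_forward(ReLULayer(), np.array([[10.0, -1.0], [0.5, -2.0]]),
              np.array([[10.0, 0.0], [0.5, 0.0]]))
```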

InputLayer

Implement a class called InputLayer that inherits from your abstract base class. Here’s the class’ public interface:

class InputLayer(Layer):
    # Input: dataIn, an NxD matrix
    # Output: None
    def __init__(self, dataIn):
        # TODO

    # Input: dataIn, an NxD matrix
    # Output: An NxD matrix
    def forward(self, dataIn):
        # TODO

    # We'll worry about these later...
    def gradient(self):
        pass

    def backward(self, gradIn):
        pass

This class’ constructor should take as a parameter an entire training dataset (as an N × D matrix), and initialize two attributes, meanX and stdX, to be row vectors of the mean and standard deviation, respectively, of the features of your training dataset. For numeric stability, set any standard deviations that are zero to 1 (this avoids divide-by-zero issues when z-scoring).

In addition, you must implement the abstract method forward such that it takes a data matrix, X, as a parameter, computes the z-scored version of this data using the meanX and stdX attributes, stores the input and output in attributes (for later use), and returns the z-scored data.
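A minimal sketch of that z-scoring logic, assuming NumPy. The Layer stub below stands in for the provided base class so the snippet is self-contained; your real implementation should inherit from the Layer in Layer.py and use its stored-input/output accessors.

```python
import numpy as np

class Layer:
    """Stub standing in for the provided abstract base class."""
    def __init__(self):
        self.prevIn = None
        self.prevOut = None

class InputLayer(Layer):
    def __init__(self, dataIn):
        super().__init__()
        # Per-feature training statistics, kept as row vectors.
        self.meanX = np.mean(dataIn, axis=0)
        self.stdX = np.std(dataIn, axis=0)
        self.stdX[self.stdX == 0] = 1  # avoid divide-by-zero when z-scoring

    def forward(self, dataIn):
        self.prevIn = dataIn
        out = (dataIn - self.meanX) / self.stdX  # z-score with training stats
        self.prevOut = out
        return out
```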

Activation Layers

Next implement classes for the following activation functions:

- Linear
- ReLU
- LogisticSigmoid
- Softmax
- Tanh (Hyperbolic Tangent Function)

The public interface for each should be (where XXX is the name of the activation function, according to the bullets above):

class XXXLayer(Layer):
    # Input: None
    # Output: None
    def __init__(self):
        # TODO

    # Input: dataIn, an NxK matrix
    # Output: An NxK matrix
    def forward(self, dataIn):
        # TODO

    # We'll worry about these later...
    def gradient(self):
        pass

    # TODO

    def backward(self, gradIn):
        pass
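Most of these forwards are one-liners; Softmax is the only subtle one, since it normalizes across each row of the N × K input and benefits from a max-shift for numerical stability. A standalone sketch (Layer plumbing omitted):

```python
import numpy as np

class SoftmaxLayer:
    """Sketch of the Softmax forward; the real class inherits from Layer."""
    def forward(self, dataIn):
        # Shift each row by its max so exp() cannot overflow; the shift
        # cancels in the ratio, leaving the softmax values unchanged.
        shifted = dataIn - np.max(dataIn, axis=1, keepdims=True)
        expd = np.exp(shifted)
        return expd / np.sum(expd, axis=1, keepdims=True)
```

Without the shift, an input like [[1000, 1000]] would overflow to inf/inf; with it, each row still sums to exactly 1.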

FullyConnectedLayer

Finally, let’s create a class for a fully connected layer, aptly called FullyConnectedLayer. This too should inherit from Layer. Following the material in the lecture slides, this class should have two attributes: a weight matrix and a bias vector. The constructor should take in two explicit parameters, the number of features coming in and the number of features coming out of this layer, and use these to initialize the weights (and biases, which are technically weights as well) to random values in the range of ±10^(-4). Its forward method once again takes in data X, storing it in its parent’s previous-input attribute, and computes the output (storing it with the parent class, and returning it) as:

Y = XW + b

where W is the weight matrix and b is the bias vector. For now, the gradient method may return nothing. In addition, provide getter and setter methods for the weight and bias attributes. Here’s the public interface:

class FullyConnectedLayer(Layer):
    # Input: sizeIn, the number of features of data coming in
    # Input: sizeOut, the number of features for the data coming out
    # Output: None
    def __init__(self, sizeIn, sizeOut):
        # TODO

    # Input: None
    # Output: The sizeIn x sizeOut matrix.
    def getWeights(self):
        # TODO

    # Input: The sizeIn x sizeOut matrix.
    # Output: None
    def setWeights(self, weights):
        # TODO

    # Input: None
    # Output: The 1 x sizeOut bias vector
    def getBiases(self):
        # TODO

    # Input: The 1 x sizeOut bias vector
    # Output: None
    def setBiases(self, biases):
        # TODO

    # Input: dataIn, an NxD data matrix
    # Output: An NxK data matrix
    def forward(self, dataIn):
        # TODO

    # We'll worry about these later...
    def gradient(self):
        pass

    def backward(self, gradIn):
        pass
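The initialization and forward pass can be sketched as below. The Layer plumbing is omitted so the snippet stands alone; the uniform ±10⁻⁴ range follows the spec above.

```python
import numpy as np

class FullyConnectedLayer:
    """Sketch only; the real class inherits from Layer and stores prev in/out."""
    def __init__(self, sizeIn, sizeOut):
        # Small random weights and biases, uniform in [-1e-4, 1e-4].
        self.__weights = np.random.uniform(-1e-4, 1e-4, (sizeIn, sizeOut))
        self.__biases = np.random.uniform(-1e-4, 1e-4, (1, sizeOut))

    def getWeights(self):
        return self.__weights

    def setWeights(self, weights):
        self.__weights = weights

    def getBiases(self):
        return self.__biases

    def setBiases(self, biases):
        self.__biases = biases

    def forward(self, dataIn):
        # Y = XW + b; the 1 x sizeOut bias row broadcasts over N observations.
        return dataIn @ self.__weights + self.__biases
```

Setting the weights and biases to the theory question's values and forwarding x = [[1, 2, 3]] is a handy sanity check on the forward method.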

3 Connecting Layers and Forward Propagation

Now let’s assemble a simple network and forward-propagate data through it. With real data! Our architecture will be:

Input → FC (1 output) → Logistic Sigmoid

We’ll use the dataset mentioned earlier in the assignment. Read in the dataset as your input data X and pass it through the architecture mentioned above. From an implementation standpoint, you’ll likely want to create instances of your classes and organize them in some sort of ordered structure such that the output of one layer is the input of the next. See the example code at the end of the Building Blocks slides.
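One simple ordered structure is just a Python list of layer instances; forward propagation then reduces to a loop. A sketch, where the two stub layers are illustrative stand-ins for your real classes:

```python
import numpy as np

def forward_propagate(layers, X):
    # Feed the output of each layer in as the input of the next.
    h = X
    for layer in layers:
        h = layer.forward(h)
    return h

class AddOne:
    """Illustrative stand-in layer."""
    def forward(self, dataIn):
        return dataIn + 1

class TimesTwo:
    """Illustrative stand-in layer."""
    def forward(self, dataIn):
        return dataIn * 2

out = forward_propagate([AddOne(), TimesTwo()], np.array([[1.0, 2.0]]))
# out == [[4. 6.]]
```

For the assignment, the list would be [InputLayer(X), FullyConnectedLayer(16, 1), LogisticSigmoidLayer()] with X as the input.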

In your report, just provide the output of the last layer pertaining to the first observation.

3.1 Output

[0.50006194]

Submission

For your submission, upload to Blackboard a single zip file containing:

1. PDF Writeup
2. Source Code
3. readme.txt file

The readme.txt file should contain information on how to run your code to reproduce results for each part of the assignment.

The PDF document should contain the following:

1. Part 1: Your solutions to the theory questions. DO NOT put screenshots or code in this. You should be typesetting your numeric results.

2. Part 2: Nothing

3. Part 3: The output pertaining to the first observation from the final layer, when given the augmented medical cost dataset as its input.
