Week 2 (15-19 July)
Day1
In today's Udemy course we learned about the following topics (a short sketch follows the list):
- Methods and functions
- *args and **kwargs
- lambda expressions, maps, and filters
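A minimal sketch of these ideas (our own illustration, not taken from the course material):

import functools

def summarise(*args, **kwargs):
    # *args collects positional arguments into a tuple,
    # **kwargs collects keyword arguments into a dict
    total = sum(args)
    label = kwargs.get("label", "sum")
    return f"{label}: {total}"

print(summarise(1, 2, 3, label="total"))  # total: 6

nums = [1, 2, 3, 4, 5]
squares = list(map(lambda x: x * x, nums))        # [1, 4, 9, 16, 25]
evens = list(filter(lambda x: x % 2 == 0, nums))  # [2, 4]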
Day2
Today we first discussed the milestone project and sorted out the problems related to it. After that we continued with our Python course and learnt about object-oriented programming in Python, which includes the following topics (see the sketch after the list):
- Class object attributes and methods
- Inheritance and polymorphism
- Special(Magic/Dunder) Methods
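A small sketch (our own example, not from the course) covering class attributes, inheritance, polymorphism, and a dunder method:

class Animal:
    kind = "animal"  # class object attribute, shared by all instances

    def __init__(self, name):
        self.name = name  # instance attribute

    def speak(self):
        return "..."

    def __str__(self):  # special (magic/dunder) method
        return f"{self.name} the {self.kind}"

class Dog(Animal):  # inheritance
    kind = "dog"

    def speak(self):  # polymorphism: overrides the parent method
        return "Woof!"

d = Dog("Rex")
print(d)          # Rex the dog
print(d.speak())  # Woof!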
Day3
Today we started with the scikit-learn implementation of the linear regression model and compared its results with our previously implemented from-scratch linear regression code on a sample data set given by Vikram sir.
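A sketch of the kind of comparison we did (the data and coefficients here are illustrative, not the actual data set from Vikram sir):

import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative data: 100 samples, 3 features, known coefficients plus noise
X = np.random.rand(100, 3)
y = X @ np.array([1.5, -2.0, 0.7]) + 0.5 + 0.1 * np.random.randn(100)

model = LinearRegression()
model.fit(X, y)
# Compare these against the estimates from the from-scratch implementation
print(model.intercept_, model.coef_)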
Later we started with modules and packages (we need them to organise code for different pieces of functionality) in the Python course. After that we learnt about exception handling in Python; a small example is sketched below.
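A minimal exception-handling sketch along the lines of what the course covered (our own example):

try:
    value = int("not a number")  # raises ValueError
except ValueError as e:
    print(f"Could not convert: {e}")
else:
    print(value)   # runs only if no exception was raised
finally:
    print("done")  # always runs, with or without an exception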
Day4
Today we were first asked to implement logistic regression from scratch using NumPy. The code for the same is shown below:
import numpy as np

sample = 1000
features = 3

# Random input data with a leading column of ones for the intercept term
x_data = np.random.rand(sample, features)
x_ones = np.ones((sample, 1))
x_initial = np.append(x_ones, x_data, axis=1)

# "True" coefficients and Gaussian noise used to generate the labels
beta = np.random.rand(features + 1, 1)
noise = np.random.randn(sample, 1)

# Sigmoid of the noisy linear combination gives the label probabilities
z0 = np.dot(x_initial, beta) + noise
y_initial = 1 / (1 + np.exp(-z0))

# Round the probabilities to binary labels (0 or 1)
y_initial = np.round(y_initial)

# Re-initialise beta randomly: this is the starting guess for training,
# continued in the Day5 code below
beta = np.random.rand(features + 1, 1)
z = np.dot(x_initial, beta)
For today we only generated the data and computed the sigmoid values.
After this we shifted to Python decorators and generators in the Python course; a small sketch of both is given below.
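A small sketch (our own example) of a decorator and a generator:

import functools

def shout(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()  # decorate the result
    return wrapper

@shout
def greet(name):
    return f"hello, {name}"

print(greet("world"))  # HELLO, WORLD

def countdown(n):
    while n > 0:
        yield n  # generators produce values lazily, one at a time
        n -= 1

print(list(countdown(3)))  # [3, 2, 1]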
Day5
Today we first completed our logistic regression code, which is as follows:
# Resume from the Day4 setup: start training from a fresh random beta
beta = np.random.rand(features + 1, 1)

for i in range(1000):
    # Forward pass: linear scores, then the sigmoid gives the predicted
    # probabilities (the same as p, the probability/sigmoid)
    z = np.dot(x_initial, beta)
    y_pred = 1 / (1 + np.exp(-z))

    # Binary cross-entropy cost
    cost = -(y_initial * np.log(y_pred) + (1 - y_initial) * np.log(1 - y_pred))

    # Gradient of the log-likelihood; note x_initial (not x_ones), so that
    # every coefficient, not just the intercept, gets its own gradient
    derivative = np.dot(np.transpose(x_initial), y_initial - y_pred) / sample

    # Gradient-ascent update with learning rate 0.01
    beta = beta + 0.01 * derivative

print(np.mean(cost))  # mean cost after the final iteration
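As a quick sanity check (our addition, not part of the original code), the training accuracy of the fitted model on the generated data can be computed:

# Accuracy of the trained model on the generated data
final_pred = np.round(1 / (1 + np.exp(-np.dot(x_initial, beta))))
print("training accuracy:", np.mean(final_pred == y_initial))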
After this we decided to do Milestone Project 2, which was to develop a Blackjack game in Python, and discussed an algorithm for the same; one piece of that logic is sketched below.
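As an illustration of the hand-scoring rule we discussed (this sketch is our own, with a hypothetical hand_value helper, not the project code):

def hand_value(cards):
    # cards is a list of ranks, e.g. ["A", "K", "7"]; face cards count as 10,
    # and each ace counts as 11 unless that would push the total over 21
    value, aces = 0, 0
    for card in cards:
        if card == "A":
            value += 11
            aces += 1
        elif card in ("K", "Q", "J"):
            value += 10
        else:
            value += int(card)
    while value > 21 and aces:
        value -= 10  # demote an ace from 11 to 1
        aces -= 1
    return value

print(hand_value(["A", "K"]))       # 21
print(hand_value(["A", "K", "5"]))  # 16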