### Real-Life Example: Customer Purchase Prediction

**Define the Problem**:

- We want to predict whether a customer will make a purchase (`0` for no, `1` for yes) based on three features:
  - **Browsing Time**: Time spent on the website.
  - **Number of Pages Viewed**: Pages viewed during the visit.
  - **Previous Purchase History**: Whether the customer has made a previous purchase (`0` for no, `1` for yes).

**Collect and Prepare Data**:

- Let’s create a simple dataset for illustration.

### Dataset

**Inputs (`X`)**:

- Each row represents a customer.
- Each column represents a feature.

```
import numpy as np

# Input features: [Browsing Time, Number of Pages Viewed, Previous Purchase History]
X = np.array([[ 5, 15, 0],
              [20, 25, 1],
              [10, 20, 0],
              [25, 30, 1]])
```

**Outputs (`y`)**:

- Each row represents the target output (purchase or not).

```
# Output labels: [Purchase (1) or Not (0)]
y = np.array([[0],
              [1],
              [0],
              [1]])
```

### Step-by-Step Implementation

#### Step 1: Define the Neural Network Structure

```
input_size = X.shape[1] # Number of input features (3)
hidden_size = 4 # Number of neurons in the hidden layer
output_size = y.shape[1] # Number of output neurons (1)
```
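As a quick sanity check on these sizes, the total number of trainable parameters can be counted by hand (a small illustrative sketch; the layer sizes match the ones defined above):

```
input_size, hidden_size, output_size = 3, 4, 1

# W1 is 3x4, b1 has 4 entries, W2 is 4x1, b2 has 1 entry
n_params = (input_size * hidden_size + hidden_size
            + hidden_size * output_size + output_size)
print(n_params)  # 21
```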

#### Step 2: Initialize Weights and Biases

```
def initialize_parameters(input_size, hidden_size, output_size):
    np.random.seed(1)  # Seed for reproducibility
    W1 = np.random.randn(input_size, hidden_size) * 0.01
    b1 = np.zeros((1, hidden_size))
    W2 = np.random.randn(hidden_size, output_size) * 0.01
    b2 = np.zeros((1, output_size))
    return W1, b1, W2, b2
```
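To confirm the initialization behaves as intended, you can inspect the shapes of the returned arrays. The sketch below repeats the function definition so it runs on its own:

```
import numpy as np

def initialize_parameters(input_size, hidden_size, output_size):
    np.random.seed(1)  # Seed for reproducibility
    W1 = np.random.randn(input_size, hidden_size) * 0.01
    b1 = np.zeros((1, hidden_size))
    W2 = np.random.randn(hidden_size, output_size) * 0.01
    b2 = np.zeros((1, output_size))
    return W1, b1, W2, b2

W1, b1, W2, b2 = initialize_parameters(3, 4, 1)
print(W1.shape, b1.shape, W2.shape, b2.shape)  # (3, 4) (1, 4) (4, 1) (1, 1)
```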

#### Step 3: Define the Activation Function and Its Derivative

```
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # Expects x to already be a sigmoid activation, i.e. x = sigmoid(z)
    return x * (1 - x)
```
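Note that `sigmoid_derivative` expects a value that is already a sigmoid output: it computes σ'(z) from a = σ(z) as a(1 − a). A quick numeric check, with the definitions repeated so the snippet is self-contained:

```
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # Expects x to already be a sigmoid activation, i.e. x = sigmoid(z)
    return x * (1 - x)

a = sigmoid(0.0)
print(a)                      # 0.5
print(sigmoid_derivative(a))  # 0.25, the maximum slope of the sigmoid
```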

#### Step 4: Forward Propagation

```
def forward_propagation(X, W1, b1, W2, b2):
    Z1 = np.dot(X, W1) + b1
    A1 = sigmoid(Z1)
    Z2 = np.dot(A1, W2) + b2
    A2 = sigmoid(Z2)
    return Z1, A1, Z2, A2
```
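A minimal shape check helps verify the forward pass: with 4 customers, 3 features, 4 hidden neurons, and 1 output, `A1` should be 4×4 and `A2` should be 4×1, with every activation strictly between 0 and 1. The sketch below repeats the needed definitions so it runs standalone:

```
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def forward_propagation(X, W1, b1, W2, b2):
    Z1 = np.dot(X, W1) + b1
    A1 = sigmoid(Z1)
    Z2 = np.dot(A1, W2) + b2
    A2 = sigmoid(Z2)
    return Z1, A1, Z2, A2

np.random.seed(1)
X = np.array([[5, 15, 0], [20, 25, 1], [10, 20, 0], [25, 30, 1]])
W1 = np.random.randn(3, 4) * 0.01
b1 = np.zeros((1, 4))
W2 = np.random.randn(4, 1) * 0.01
b2 = np.zeros((1, 1))

Z1, A1, Z2, A2 = forward_propagation(X, W1, b1, W2, b2)
print(A1.shape, A2.shape)  # (4, 4) (4, 1)
```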

#### Step 5: Backpropagation

```
def backpropagation(X, y, Z1, A1, Z2, A2, W1, b1, W2, b2, learning_rate):
    m = X.shape[0]
    dZ2 = A2 - y
    dW2 = np.dot(A1.T, dZ2) / m
    db2 = np.sum(dZ2, axis=0, keepdims=True) / m
    dA1 = np.dot(dZ2, W2.T)
    dZ1 = dA1 * sigmoid_derivative(A1)
    dW1 = np.dot(X.T, dZ1) / m
    db1 = np.sum(dZ1, axis=0, keepdims=True) / m
    # Update parameters
    W1 -= learning_rate * dW1
    b1 -= learning_rate * db1
    W2 -= learning_rate * dW2
    b2 -= learning_rate * db2
    return W1, b1, W2, b2
```
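A detail worth flagging: `dZ2 = A2 - y` is the exact gradient of the binary cross-entropy loss with respect to `Z2` when the output activation is sigmoid (the squared-error value printed during training is only a monitoring metric). One way to verify the gradient formulas is a numerical gradient check. The sketch below assumes binary cross-entropy as the loss and compares the analytic `dW2` against a central-difference estimate for a single weight:

```
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Same data and initialization scheme as above
np.random.seed(1)
X = np.array([[5, 15, 0], [20, 25, 1], [10, 20, 0], [25, 30, 1]])
y = np.array([[0.], [1.], [0.], [1.]])
W1 = np.random.randn(3, 4) * 0.01
b1 = np.zeros((1, 4))
W2 = np.random.randn(4, 1) * 0.01
b2 = np.zeros((1, 1))

def bce_loss(W2_):
    # Binary cross-entropy as a function of W2 only (everything else held fixed)
    A1 = sigmoid(X @ W1 + b1)
    A2 = sigmoid(A1 @ W2_ + b2)
    return -np.mean(y * np.log(A2) + (1 - y) * np.log(1 - A2))

# Analytic gradient, same formula as in backpropagation()
A1 = sigmoid(X @ W1 + b1)
A2 = sigmoid(A1 @ W2 + b2)
dW2 = A1.T @ (A2 - y) / X.shape[0]

# Central-difference estimate for W2[0, 0]
eps = 1e-6
Wp, Wm = W2.copy(), W2.copy()
Wp[0, 0] += eps
Wm[0, 0] -= eps
numeric = (bce_loss(Wp) - bce_loss(Wm)) / (2 * eps)
print(abs(numeric - dW2[0, 0]))  # should be close to zero
```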

#### Step 6: Training the Network

```
def train(X, y, input_size, hidden_size, output_size, epochs, learning_rate):
    W1, b1, W2, b2 = initialize_parameters(input_size, hidden_size, output_size)
    for epoch in range(epochs):
        # Forward propagation
        Z1, A1, Z2, A2 = forward_propagation(X, W1, b1, W2, b2)
        # Backpropagation
        W1, b1, W2, b2 = backpropagation(X, y, Z1, A1, Z2, A2, W1, b1, W2, b2, learning_rate)
        # Optionally print the loss every 100 epochs
        if epoch % 100 == 0:
            loss = np.mean(np.square(y - A2))  # mean squared error, used here for monitoring
            print(f'Epoch {epoch}, Loss: {loss}')
    return W1, b1, W2, b2
```

#### Step 7: Making Predictions

```
def predict(X, W1, b1, W2, b2):
    _, _, _, A2 = forward_propagation(X, W1, b1, W2, b2)
    return A2

# Example usage
if __name__ == "__main__":
    # Define the neural network structure
    input_size = X.shape[1]
    hidden_size = 4  # Number of neurons in the hidden layer
    output_size = y.shape[1]
    epochs = 10000
    learning_rate = 0.1
    # Train the neural network
    W1, b1, W2, b2 = train(X, y, input_size, hidden_size, output_size, epochs, learning_rate)
    # Make predictions
    predictions = predict(X, W1, b1, W2, b2)
    print("Final output after training:")
    print(predictions)
```
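The trained network returns probabilities in (0, 1) rather than hard class labels. A common final step is to threshold at 0.5; the probabilities below are made up for illustration:

```
import numpy as np

# Hypothetical sigmoid outputs from a trained network
probs = np.array([[0.03], [0.96], [0.05], [0.97]])
labels = (probs > 0.5).astype(int)
print(labels.ravel())  # [0 1 0 1]
```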

### Step-by-Step Explanation

**Define the Problem**:

- We have 4 examples of customer behavior with 3 features each.
- We want to predict whether they will make a purchase based on their browsing time, number of pages viewed, and previous purchase history.

**Prepare the Data**:

- `X` represents the input features.
- `y` represents the target output (purchase or not).

**Initialize Weights and Biases**:

- Initialize small random values for weights (`W1`, `W2`).
- Initialize zeros for biases (`b1`, `b2`).

**Define Activation Functions**:

- Use the sigmoid function for activation.
- Use its derivative for backpropagation.

**Forward Propagation**:

- Compute intermediate values (`Z1`, `A1`, `Z2`, `A2`) using the current weights and biases.

**Backpropagation**:

- Calculate gradients of the loss with respect to the weights and biases.
- Update the weights and biases using these gradients.

**Training**:

- Train the network for a specified number of epochs.
- Adjust weights and biases iteratively to minimize the loss.

**Making Predictions**:

- Use the trained network to make predictions on new data.

By following these steps, we can implement a neural network to predict customer purchases based on their behavior. The key steps include data preparation, network initialization, forward propagation, backpropagation, training, and making predictions.