# Make Your Own Neural Network in Python

Machine learning is one of the fastest-growing fields, and its importance cannot be overstated.

This course teaches one of the fundamental concepts of machine learning: the neural network.

You will learn the basic concepts of building a model, along with the mathematics behind neural networks, and then build one from scratch in Python.

You will also learn how to train and optimize your network to achieve better results.

We have designed this course specifically for beginners; no prior programming experience is required.
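As a taste of what lies ahead, here is a minimal sketch (not the course's own code, and the weights and inputs are illustrative values) of the kind of artificial neuron you will build: a weighted sum of inputs passed through a sigmoid activation function.

```python
import math

def sigmoid(x):
    # Squash any real number into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(inputs, weights):
    # Weighted sum of the inputs, then the sigmoid activation
    total = sum(i * w for i, w in zip(inputs, weights))
    return sigmoid(total)

# Illustrative values: two inputs, two weights
print(round(neuron_output([1.0, 0.5], [0.8, -0.2]), 3))  # → 0.668
```

Everything in the course builds on this idea: many such neurons arranged in layers, with the weights adjusted iteratively during training.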

**Contents**

**1. Prologue**

- The Search for Intelligent Machines
- A Nature Inspired New Golden Age

**2. Introduction**

- Who is this course for?
- What will we do?
- How will we do it?
- Author’s Note

**3. Part 1 - A Little Background**

- Easy for Me, Hard for You
- A Simple Predicting Machine
- Estimating the Constant "c" Iteratively
- Classifying vs. Predicting
- Building a Simple Classifier
- Error in the Training Classifier
- Refining the Parameters of the Training Classifier
- Setting the Learning Rate in the Training Classifier
- Limitations of Linear Classifiers
- Representing Boolean Functions with Linear Classification

**4. Part 2 - Let's Get Started!**

- Neurons, Nature’s Computing Machines
- How Do Neurons Really Work?
- What is an Activation Function?
- Replicating a Neuron in an Artificial Model
- Following Signals Through A Simpler Network
- Calculating Neural Network Output
- Matrix Multiplication is Useful... Honest!
- Calculating Inputs for Internal Layers
- A Three Layer Example: Working on Input Layer
- A Three Layer Example: Working on Hidden Layer
- A Three Layer Example: Working on Output Layer

**5. Part 3 - Backward Propagation of Error**

- Learning Weights From More Than One Node
- Backpropagating Errors From More Output Nodes
- Backpropagation: Splitting the Error
- Backpropagation: Recombining the Error
- Backpropagating Errors with Matrix Multiplication

**6. Part 4 - Adjusting the Link Weights**

- How Do We Actually Update Weights?
- Embrace Pessimism
- Understanding the Gradient Descent Algorithm
- How to Transform the Output into an Error Function?
- Using Gradient Descent to Update Weights
- Choosing the Right Weights...Iteratively!
- One Last Thing...
- Weight Update Worked Example
- Preparing Data: Inputs & Outputs
- Preparing Data: Random Initial Weights

**7. Part 5 - A Gentle Start with Python**

- Getting Started
- Loops
- Functions
- Arrays
- Plotting Arrays
- Objects
- Methods

**8. Part 6 - Neural Network with Python**

- Building the Neural Network Class
- Initializing the Network
- Weights - The Heart of the Network
- Optional: More Sophisticated Weights
- Querying the Network
- Applying Sigmoid Function
- The Code Thus Far...
- Testing Our Code Thus Far
- Training the Network
- Refining the Weights
- The Complete Neural Network Code

**9. Part 7 - Testing Neural Network against MNIST Dataset**

- The MNIST Dataset of Handwritten Numbers
- A Quick Look at the Data Files
- Getting the Dataset Ready
- Plotting the Data Points
- Preparing the MNIST Training Data
- The Need to Rescale the Target Output
- Python Code to Create and Rescale the Output Array
- Updating Neural Network Code
- Testing the Network on a Subset
- Testing the Network Against the Whole Dataset!
- Updating the Neural Network Code...Again

**10. Part 8 - Some Suggested Improvements**

- Tweaking the Learning Rate
- Doing Multiple Runs
- Changing the Network Shape

**11. Part 9 - Even More Fun!**

- Your Own Handwriting
- Inside the Mind of a Neural Network
- Backward Query
- More Brain Scans
- Creating New Training Data by Rotations

**12. Epilogue**

- Epilogue

**13. Appendix: A Small Guide to Calculus**

- A Gentle Introduction
- A Flat Line
- A Sloped Straight Line
- A Curved Line
- Calculus By Hand
- Calculus Not By Hand
- Calculus without Plotting Graphs
- Patterns
- Functions of Functions
- Handling Independent Variables
