
Arshad Kazi

Journey of Curiosity


Tag: Activation functions

ReLU, Sigmoid & Tanh Activation Functions

Why do we use the ReLU activation function over sigmoid or tanh? Why do activation functions suffer from the vanishing gradient problem?

Read more ›

Tagged with Activation functions, CNN, Deep learning, ReLU. Posted on March 3, 2021.
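
To make the post's question concrete, here is a minimal NumPy sketch (an illustration added to this listing, not code from the post itself) comparing the gradients of the three activations. Sigmoid and tanh saturate, so their gradients shrink toward zero for large |x|, which is the root of the vanishing gradient problem; ReLU's gradient stays at exactly 1 for every positive input.

```python
import numpy as np

# Evaluate each activation's gradient at a few points: [-6, -4, -2, 0, 2, 4, 6]
x = np.linspace(-6, 6, 7)

sigmoid = 1 / (1 + np.exp(-x))
d_sigmoid = sigmoid * (1 - sigmoid)   # peaks at 0.25; near zero for |x| > 4

d_tanh = 1 - np.tanh(x) ** 2          # peaks at 1.0; still near zero for |x| > 3

d_relu = (x > 0).astype(float)        # exactly 1 for all positive inputs

print(d_sigmoid.round(3))  # [0.002 0.018 0.105 0.25  0.105 0.018 0.002]
print(d_tanh.round(3))     # [0.    0.001 0.071 1.    0.071 0.001 0.   ]
print(d_relu)              # [0. 0. 0. 0. 1. 1. 1.]
```

Multiplied across many layers during backpropagation, those sub-one sigmoid/tanh gradients shrink exponentially, while ReLU passes the gradient through unchanged on its active side.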

Activation Function & its Non-Linearity

What is an activation function?
What is linearity or non-linearity?
Why do we use a non-linear activation function?

Read more ›

Tagged with Activation functions, Convolutional Neural Networks, Deep learning, Machine Learning, Non-linearity. Posted on February 16, 2021.
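
As a small sketch of the non-linearity question (again written for this listing, not taken from the post): stacking linear layers with no activation between them collapses into a single linear map, so the network gains no expressive power; a non-linear activation such as ReLU breaks that collapse.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation in between...
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)

two_linear_layers = W2 @ (W1 @ x)

# ...are equivalent to one linear layer with weights W2 @ W1.
one_linear_layer = (W2 @ W1) @ x
print(np.allclose(two_linear_layers, one_linear_layer))  # True

# Inserting a non-linearity breaks the collapse:
relu = lambda z: np.maximum(z, 0.0)
print(np.allclose(W2 @ relu(W1 @ x), one_linear_layer))  # almost surely False
```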

