Blog posts

2025

2024

ML4LM - Vanishing Gradient Problem? [medium]

2 minute read

Published:

Ever noticed that while training neural networks, the loss stops decreasing and the weights stop updating after a certain point? Understanding this hitch means looking at how we optimize the loss with gradient descent, nudging the weights step by step toward the minimum.
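A minimal sketch of where the vanishing gradient comes from, assuming a deep network of sigmoid layers: the sigmoid's derivative never exceeds 0.25, and backprop multiplies one such factor per layer, so the gradient reaching early layers shrinks geometrically (the layer count and inputs here are illustrative).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # peaks at 0.25, at x = 0

# Chain rule across 10 sigmoid layers: multiply one derivative per layer.
grad = 1.0
for _ in range(10):
    grad *= sigmoid_grad(0.0)  # 0.25 even at the steepest point

print(grad)  # 0.25 ** 10, roughly 9.5e-7: early layers barely update
```

Even in this best case the signal is a millionth of its size at the output, which is why the loss plateaus while early-layer weights sit still.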

2023

ML4LM - How does Lasso bring sparsity? [medium]

2 minute read

Published:

Many of us have heard about Lasso and its ability to bring sparsity to models, but not everyone understands the nitty-gritty of how it actually works. In a nutshell, Lasso is like a superhero for overfitting problems, tackling them through a technique called regularization. If you’re not familiar with regularization and how it fights overfitting, I’d recommend checking that out first. For now, let’s dive into the magic of how Lasso brings sparsity.
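The mechanism behind that sparsity can be sketched in a few lines, assuming the standard coordinate-descent view of Lasso: each update applies a soft-thresholding step, which not only shrinks weights but snaps any weight within the penalty of zero to exactly zero (the weights below are made up for illustration).

```python
import numpy as np

def soft_threshold(w, lam):
    """Proximal step for the L1 penalty: shrink every weight toward
    zero by lam, and set anything smaller than lam exactly to zero."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

w = np.array([3.0, -0.05, 0.8, 0.02, -2.0])
print(soft_threshold(w, 0.1))  # the two small weights become exactly 0
```

An L2 (Ridge) penalty, by contrast, only rescales weights, so they approach zero but never land on it; the hard zeroing above is what makes Lasso's models sparse.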

ML4LM - What are Derivatives? [medium]

5 minute read

Published:

Back in my school days, up to the 10th grade, I had a genuine love for math. Whether it was tackling geometry, diving into trigonometry, or exploring progressions, I felt pretty confident in my abilities. But then came derivatives, and suddenly everything took a sharp turn. Instead of visualizing and understanding the beauty of math, I found myself stuck in a maze of formulas and rote differentiation problems.

ML4LM - Feature Scaling - Normalization [medium]

3 minute read

Published:

Ever wondered how data gets its makeover before revealing its insights? Enter the battleground of data refinement, where normalization and standardization go head-to-head. Think of it as a compelling tale of two methods, each with its unique charm.
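The two contenders can be sketched side by side, assuming the usual definitions: min-max normalization squeezes a feature into [0, 1], while standardization recenters it to zero mean and unit variance (the sample values are made up for illustration).

```python
import numpy as np

x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

# Normalization (min-max scaling): map values into [0, 1]
x_norm = (x - x.min()) / (x.max() - x.min())

# Standardization (z-score): zero mean, unit variance
x_std = (x - x.mean()) / x.std()

print(x_norm)  # [0.   0.25 0.5  0.75 1.  ]
print(x_std)   # symmetric around 0, in units of standard deviation
```

Which one wins depends on the data: min-max is sensitive to outliers (a single extreme value squashes everything else), while z-scores have no fixed range, which some models and activations care about.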

ML4LM - Cleaning the Data [medium]

4 minute read

Published:

Cleaning data for Machine Learning is like preparing for a road trip where your model is the driver, and your data is the map. However, the map is a mishmash of routes, some as straightforward as a highway, while others resemble a convoluted maze that even a GPS would find confusing.