Artificial Intelligence | AI on Instagram: "Gradient descent is a fundamental optimization algorithm used by most AI models to learn from data by minimizing a loss function, which measures how far the model’s predictions are from the true values. Conceptually, it treats the loss function as a landscape (we call this the loss landscape) with peaks and valleys representing high and low errors. At any point on this landscape, the gradient (vector of slopes) indicates the direction and steepness of
7.6K views
3 weeks ago
Instagram
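The caption above pictures the loss function as a landscape whose gradient (vector of slopes) points uphill. A minimal sketch of that idea in two dimensions, using a hypothetical bowl-shaped loss (not taken from the video itself):

```python
# The "loss landscape" picture from the caption above, in two dimensions:
# L(a, b) = (a - 1)^2 + (b + 2)^2, minimized at (1, -2).
def gradient(a, b):
    return (2.0 * (a - 1.0), 2.0 * (b + 2.0))  # vector of slopes

a, b, lr = 5.0, 5.0, 0.1
for _ in range(300):
    ga, gb = gradient(a, b)
    a -= lr * ga  # step opposite the gradient in each coordinate
    b -= lr * gb

print(round(a, 3), round(b, 3))  # approaches (1.0, -2.0)
```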
0:27
MichaeL | Teaching myself ML on Instagram: "Using gradient descent in logistic regression. ⬇️⬇️⬇️⬇️⬇️⬇️⬇️⬇️⬇️⬇️⬇️⬇️⬇️ —-——-——-——-——-——-——-——-— **Study Group Info** Want to master Machine Learning? ➡️Comment “Study” to join my exclusive study group** —-——-——-——-——-——-——-——-— ⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️⬆️ —- ⏳ 1 H —- #math #ml #ai #machinelearning #artificialintelligence #DataScience #computerscience"
83.9K views
7 months ago
Instagram
0:59
Gradient Descent optimizer in Deep Learning
4.7K views
Dec 6, 2023
YouTube
OurSubject
Stochastic Gradient Descent (SGD) is the heart of machine learning optimization! By updating weights in small steps, it ensures faster and efficient learning for models. Simplicity with power! 🚀 #SGD #MachineLearning #Optimization
202 views
9 months ago
TikTok
yazilimciabi
Batch Gradient Descent Weight Updates 🏋️ - Deep Learning Beginner 👶 📈 - Topic 102
939 views
Mar 17, 2024
YouTube
deeplizard
Gradient Descent The Heart of Neural Network Optimization Explained
34 views
7 months ago
YouTube
Chain
Deeply AI: AI, ChatGPT, Robotics, Data Science on Instagram: "Gradient descent is an optimization method in deep learning that helps a model improve its predictions by adjusting its weights and biases. It works by measuring the error between the actual and predicted values using a cost function. The algorithm calculates the gradient (slope) of the cost function and updates the parameters by moving in the opposite direction of the gradient. The size of each step is controlled by a learning rate.
5K views
6 months ago
Instagram
deeply.ai
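Several of the captions above describe the same core update: compute the gradient of the cost, then move the parameter in the opposite direction, scaled by a learning rate. A minimal illustrative sketch on a one-dimensional toy function (an assumption for demonstration, not code from any listed video):

```python
# Gradient descent on f(w) = (w - 3)^2, whose minimum is at w = 3.
def grad(w):
    return 2.0 * (w - 3.0)  # derivative of (w - 3)^2

w = 0.0   # initial parameter
lr = 0.1  # learning rate: controls the size of each step
for _ in range(200):
    w -= lr * grad(w)  # move opposite the gradient (downhill)

print(round(w, 4))  # converges toward 3.0
```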
Training AI Models: From Data to Decisions #ai #technology #trending #trendingshorts #viralvideo
40 views
Aug 27, 2024
YouTube
Neural Nexus
Gradient Descent visualized #datascience #machinelerning #deeplearning #ai #math
3.6K views
7 months ago
YouTube
Giffah
Understanding Gradient Descent in AI
93 views
6 months ago
YouTube
Synaptigon
Gradient Descent Vs Stochastic Gradient Descent
401 views
Mar 25, 2024
YouTube
EduFlair KTU CS
1:00
How Stochastic Indicator Works (Learn It in 1 Minute!)
20.3K views
Nov 7, 2020
YouTube
FXDavid
Artificial Intelligence | AI on Instagram: "Gradient descent is a fundamental optimization algorithm used by most AI models to learn from data by minimizing a loss function, which measures how far the model’s predictions are from the true values. Conceptually, it treats the loss function as a landscape (we call this the loss landscape) with peaks and valleys representing high and low errors. At any point on this landscape, the gradient (vector of slopes) indicates the direction and steepness of
20.6K views
2 months ago
Instagram
1:17
Instagram
5.2K views
5 months ago
Instagram
0:07
Ashish Singh on Instagram: "Stochastic Gradient Descent out here skipping the queue, making educated guesses, and still reaching the loss minimum faster than your weekend plans! 😅 #StochasticGradientDescent #MachineLearningHumor #DeepLearning #BioinformaticsJokes #MiniBatchMadness #OptimizerOnTheLoose #LossFunctionChronicles #MLLife #FastButFuzzy"
12.1K views
1 month ago
Instagram
AI | Machine Learning | Tech on Instagram: "Stochastic Gradient Descent (SGD) is an optimization method used to minimize a function, usually a loss function, to improve a model. Think of it like trying to get to the lowest point of a hill: the gradient represents the slope, telling you which direction to move to go down. Instead of looking at the entire hill (or dataset), SGD picks small random subsets of data at each step, calculating the gradient based on that subset. This makes updates faster
5.6K views
2 months ago
Instagram
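The SGD caption above notes that, instead of computing the gradient over the whole dataset, each step uses a small random subset. A toy sketch of that sampling idea, fitting a line y = w·x on a noise-free synthetic dataset (all names and data here are hypothetical):

```python
import random

# Stochastic gradient descent on a least-squares fit y ≈ w * x,
# estimating the gradient from one random sample per step.
data = [(x, 2.0 * x) for x in range(1, 6)]  # true slope is 2.0

w = 0.0
lr = 0.005
random.seed(0)
for _ in range(2000):
    x, y = random.choice(data)    # random subset (here: a single point)
    g = 2.0 * (w * x - y) * x     # gradient of (w*x - y)^2 w.r.t. w
    w -= lr * g

print(round(w, 3))  # close to the true slope 2.0
```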
1:14
AI • Machine Learning • Tech on Instagram: "Gradient descent is an optimization algorithm widely used in machine learning to minimize a loss function, which is a measure of how well a model’s predictions match the actual outcomes. In the gradient descent process, the model iteratively adjusts its parameters (its weights and biases) to reduce the loss. The parameters are adjusted based on the gradient, or partial derivatives, of the loss function with respect to each parameter. The gradient point
50.3K views
3 months ago
Instagram
aibutsimple
e-Train Brain Academy on Instagram: "Gradient Descent is an optimization algorithm used to minimize the loss (or error) in machine learning models by updating the model’s parameters (like weights) in the direction that reduces the loss. 🧠 How It Works (in simple terms): Imagine you're standing on a hill and want to get to the lowest point in the valley (the minimum of a function). You look around, figure out the steepest downward direction (the negative gradient), and take a small step that way
1.8K views
1 month ago
Instagram
Insightforge | Data Science & AI on Instagram: "Gradient Descent: how AI models actually learn At the core of most AI training lies gradient descent, an algorithm that minimizes a loss function — the measure of how far predictions are from the truth. Think of the loss function as a landscape of hills and valleys: High peaks = big errors Low valleys = small errors The gradient shows the steepest uphill direction. Gradient descent simply moves the opposite way — downhill — step by step, adjusting
1.9K views
1 week ago
Instagram
1:28
AI • Machine Learning • Tech on Instagram: "In deep learning, gradient descent is an optimization algorithm (or optimizer) that adjusts the model’s weights and biases iteratively to minimize some cost function. This cost function is used to measure the error between the correct label and the model’s prediction. By computing the gradient of the cost function with respect to these parameters, the algorithm computes the direction of steepest descent. To reduce the cost, it takes small “steps” in th
39.6K views
7 months ago
Instagram
0:47
Instagram
40.8K views
Aug 21, 2024
Instagram
aibutsimple
AI • Machine Learning • Tech on Instagram: "Gradient descent is a fundamental optimization algorithm used by most AI models to learn from data by minimizing a loss function, which measures how far the model’s predictions are from the true values. Conceptually, it treats the loss function as a landscape (we call this the loss landscape) with peaks and valleys representing high and low errors. At any point on this landscape, the gradient (vector of slopes) indicates the direction and steepness of
154.9K views
2 months ago
Instagram
Artificial Intelligence | AI on Instagram: "Gradient descent is a fundamental optimization algorithm used by most AI models to learn from data by minimizing a loss function, which measures how far the model’s predictions are from the true values. Conceptually, it treats the loss function as a landscape (we call this the loss landscape) with peaks and valleys representing high and low errors. At any point on this landscape, the gradient (vector of slopes) indicates the direction and steepness of
4.9K views
1 week ago
Instagram
AI | Machine Learning | Tech on Instagram: "In deep learning, gradient descent is an optimization algorithm (or optimizer) that adjusts the model’s weights and biases iteratively to minimize some cost function. This cost function is used to measure the error between the correct label and the model’s prediction. By computing the gradient of the cost function with respect to these parameters, the algorithm computes the direction of steepest descent. To reduce the cost, it takes small “steps” in th
3.8K views
2 months ago
Instagram
AI | Machine Learning | Tech on Instagram: "Gradient descent is an optimization algorithm that minimizes a function by iteratively stepping toward its lowest value. It works by taking steps proportional to the negative gradient (slope) of the function at the current point, moving in the direction of steepest descent. This process continues until the function reaches its minimum or the steps become negligibly small. Credit: @3blue1brown #machinelearning #deeplearning #math #neuralnetwork #datasci
2.3K views
1 month ago
Instagram
Neural AI on Instagram: "Gradient descent is a fundamental optimization algorithm used by most AI models to learn from data by minimizing a loss function, which measures how far the model’s predictions are from the true values. Conceptually, it treats the loss function as a landscape (we call this the loss landscape) with peaks and valleys representing high and low errors. At any point on this landscape, the gradient (vector of slopes) indicates the direction and steepness of the fastest increas
2.9K views
2 months ago
Instagram
1:17
AI • Machine Learning • Tech on Instagram: "Stochastic Gradient Descent (SGD) is an optimization method used to minimize a function, usually a loss function, to improve a model. Think of it like trying to get to the lowest point of a hill: the gradient represents the slope, telling you which direction to move to go down. Instead of looking at the entire hill (or dataset), SGD picks small random subsets of data at each step, calculating the gradient based on that subset. This makes updates faster
41K views
5 months ago
Instagram
Artificial Intelligence | AI on Instagram: "Backpropagation is the algorithm used to compute gradients in neural networks, making it a crucial component of gradient descent. It works by applying the chain rule of calculus to propagate the error from the output layer back through the network, calculating how much each weight contributed to the total loss. Once these gradients are computed, they are used in the gradient descent update step: each weight is adjusted by subtracting the gradient multi
17.7K views
3 months ago
Instagram
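The backpropagation caption above describes applying the chain rule to split the loss gradient into per-layer factors before the gradient descent update. A sketch of that decomposition for a single sigmoid neuron (a deliberately minimal, assumed setup rather than a full network):

```python
import math

# Backpropagation through a one-neuron "network":
# prediction p = sigmoid(w * x), loss L = (p - y)^2.
# The chain rule splits dL/dw into dL/dp * dp/dw.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, y = 1.5, 1.0  # single training example
w, lr = 0.0, 0.5

for _ in range(500):
    p = sigmoid(w * x)         # forward pass
    dL_dp = 2.0 * (p - y)      # gradient of loss w.r.t. prediction
    dp_dw = p * (1.0 - p) * x  # gradient of prediction w.r.t. weight
    w -= lr * dL_dp * dp_dw    # gradient descent update

print(round(sigmoid(w * x), 2))  # prediction driven toward y = 1.0
```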
Artificial Intelligence | AI on Instagram: "Stochastic Gradient Descent (SGD) is an optimization method used to minimize a function, usually a loss function, to improve a model. Think of it like trying to get to the lowest point of a hill: the gradient represents the slope, telling you which direction to move to go down. Instead of looking at the entire hill (or dataset), SGD picks small random subsets of data at each step, calculating the gradient based on that subset. This makes updates faster
4K views
4 months ago
Instagram
Artificial Intelligence | AI on Instagram: "Stochastic Gradient Descent (SGD) is an optimization method used to minimize a function, usually a loss function, to improve a model. Think of it like trying to get to the lowest point of a hill: the gradient represents the slope, telling you which direction to move to go down. Instead of looking at the entire hill (or dataset), SGD picks small random subsets of data at each step, calculating the gradient based on that subset. This makes updates faster
21K views
1 month ago
Instagram