Title | : | StatQuest: K-means clustering |
Duration | : | 8:31 |
Date of publication | : | |
Views | : | 1.3 million |
|
The first time in 5 years that I actually understood ML algorithms clearly, and I actually enjoy them now Comment from : GIGI |
|
This is brilliantly explained, thank you! It still looks somewhat similar to KNN, confusing to understand which and when to use :/ Comment from : Somish |
|
Hi sir, are the k-means and k-nearest-neighbours algorithms the same? Comment from : Beradinho Yilmaz |
|
Can anyone please explain why we need to convert the calculated distances from the last chosen centroid into a probability distribution, instead of finding the next centroid just by taking the largest distance (in KMeans++)? Comment from : Madhu Varshini |
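For readers with the same question: in k-means++ seeding, always taking the single farthest point would reliably pick outliers, so the next centroid is instead *sampled* with probability proportional to each point's squared distance from its nearest chosen centroid. A minimal sketch under that assumption (the function name `kmeans_pp_init` and the 1-D data are illustrative, not from the video):

```python
import random

def kmeans_pp_init(points, k):
    """Sketch of k-means++ seeding on 1-D points."""
    centroids = [random.choice(points)]  # first centroid: uniform random
    while len(centroids) < k:
        # squared distance from each point to its nearest chosen centroid
        d2 = [min((p - c) ** 2 for c in centroids) for p in points]
        total = sum(d2)
        # sample proportional to d2: far points are likely but not certain
        # to be picked, so a lone outlier does not always become a centroid
        r = random.uniform(0, total)
        acc = 0.0
        for p, w in zip(points, d2):
            acc += w
            if acc >= r:
                centroids.append(p)
                break
    return centroids

centroids = kmeans_pp_init([1.0, 2.0, 10.0, 11.0], k=2)
```

The weighted sampling is the whole point of the probability distribution: it keeps the spread-out behaviour of farthest-point seeding while staying robust to outliers.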
|
Great video, but I haven't understood what you mean by variation when explaining how to pick a value for k, could anyone explain please? Comment from : Wast |
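For readers asking the same thing: "variation" here is the total within-cluster sum of squared distances, i.e. each point's squared distance to its own cluster's centre, summed over all points. A small illustrative sketch (function name and data are made up for the example):

```python
def within_cluster_variation(points, centroids):
    """Sum, over all points, of the squared distance to the nearest centroid."""
    return sum(min((p - c) ** 2 for c in centroids) for p in points)

data = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]

# k = 1: a single centroid at the overall mean
v1 = within_cluster_variation(data, [sum(data) / len(data)])  # 125.5
# k = 2: one centroid per obvious group
v2 = within_cluster_variation(data, [2.0, 11.0])              # 4.0
```

Plotting this variation for k = 1, 2, 3, … gives the elbow plot from the video: it always decreases as k grows, and the "elbow" is where adding more clusters stops buying a big reduction.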
|
Thanks, Great Video 👍👍 Comment from : erfan am kh |
|
Nice explanation ❤❤❤ Comment from : Rubana hoque chowdhury |
|
Does cluster analysis have to start with a multicollinearity test? Comment from : Ante Achmad |
|
One of the best and simplest explanations of K-means clustering!! Comment from : Mohamad Suhaib Abdul Rahman |
|
i love it Comment from : alex zaznobin |
|
Great explanation! Many thanks 👍 Comment from : Abdullah Muhammad |
|
Wow, this is K-means on another level. If you have a video about "Distributed Hash Tables", please let me know Comment from : Tesfaye Yimam |
|
Isn't a t-SNE plot a type of clustering too? Comment from : Kartik Malladi |
|
Excellent explanation! Comment from : Gil R |
|
you are Genius:) Comment from : Aleksey |
|
How to cluster text data Comment from : Madhavi Bauskar |
|
This is by far the best intro of the series: short, focused and related Comment from : iHisam |
|
Thank you for my life - tired student studying for AI final Comment from : whatchyagonnado |
|
Is choosing the initial data points randomly, the best option we have? I can't help but think that random would be very inefficient Comment from : Mars Park |
|
great Comment from : Jaskaran Singh |
|
Great content and quite simplified well, thanks Comment from : Wiza Phiri |
|
This felt like rocket science until today!! thanks! Comment from : Tega Obarakpor |
|
Great video, Thank you Comment from : Chinthaka Liyana Arachchi |
|
You explained in 8 minutes what my prof attempted to do in 2 hrs. You are the best!!!!! Comment from : Kam Her |
|
what do you mean by variation? do you have a video to explain it? Comment from : Ben Lechner |
|
Sounds like there should be an upgrade to this technique: the location of the K points is random only in the first step, and after that some kind of gradient descent may be used, right? Comment from : Igor G |
|
This is by far the clearest explanation I’ve seen Great video! Comment from : David J |
|
Thank you for explaining this! Comment from : The 1 Kid Couple |
|
May Goddess Saraswati's blessings be upon you 🙏 Comment from : Abhishek Bendre |
|
You forgot to talk about "Outliers" Comment from : Bilal shawky |
|
Loved every minute of the video, Sir!! Just studied it a day before the exam & real glad that I did 😌 Comment from : Rocky Etchison |
|
what a great video Comment from : Hashim Mahmood |
|
the thumbnail wasn't a clickbait, its really clearly explained! thank you sir Comment from : Mustapha Batoot |
|
What if my numeric data is on different scales? Wouldn't that confuse the way it identifies the nearest point? Do I need to scale all my data first? Comment from : J94 |
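A common answer to this question (standard practice, not something stated in the video): yes, features on wildly different scales will dominate Euclidean distance, so standardize each feature to mean 0 and standard deviation 1 first. A minimal sketch with made-up example data:

```python
import statistics

def standardize(column):
    """Z-score a feature so it has mean 0 and (sample) stdev 1,
    keeping any single feature from dominating Euclidean distance."""
    mu = statistics.mean(column)
    sigma = statistics.stdev(column)
    return [(x - mu) / sigma for x in column]

# raw distances between these rows would be driven almost entirely
# by income, since it spans tens of thousands vs fractions of a metre
heights = [1.6, 1.7, 1.8, 1.9]
incomes = [30000.0, 45000.0, 60000.0, 90000.0]

z_heights = standardize(heights)
z_incomes = standardize(incomes)
```

After standardizing, a one-unit move in height and a one-unit move in income contribute comparably to the distance, so the clusters reflect both features.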
|
BEST EVER I CAN FIND ON INTERNET THANK YOU Comment from : Linn Htuts WORLD |
|
If you ever teach shell scripting you should replace the "bam" with "shebang" or #! Comment from : Undine |
|
Best Intro ever! Comment from : Đạt Dương |
|
Amazing video Thank you Loved it 🙌 At the end, for distances with 3/4 dimensions, shouldn’t those be cube root/fourth root? Comment from : Bhavdeep |
|
This is so wholesome, informative, and engaging all at the same time Thank you so much for this! Love the intro tune btw Comment from : Jared Cortes |
|
Bam?! Comment from : Tushar Jain |
|
Great video, thank you so much Keep it up with the amazing content Comment from : Quang Anh |
|
Amazing and clear explanation! Horrible song tho Comment from : aymen hammami |
|
The video was really understandable! But how do you calculate the variation? Comment from : Neg the trainer |
|
You are a genius, sir! I wish you were my teacher when I was in my graduation. Thank you 💌 Comment from : DEEPAK RAWAT |
|
GREAT video! Shame there's cringe at the beginning :D Comment from : Povilas Marcinkevicius |
|
Josh: Bam??? Me: Damn…that’s beautifully explained 😢 Comment from : RBG02005 |
|
Thanks for the explanation, I finally understood this method Comment from : MicroStick69 |
|
6:32 Comment from : Rachibe Liegise |
|
THANK YOU SOO MUCH!! Arigato gozaimus Comment from : Kofi Jr |
|
Could you please share the same github link for python code? Comment from : muskan rath |
|
I got a campus placement as a Data Scientist and as of yet I've been in the industry for 45 years I am knee deep in everything data science barring ML and AI related intensive coding because I only know the algos at a bird's eye view and my regular work doesn't entail me needing a whole lot of things But I sat down today to start from K-means and boy am I happy I found this! Such a confidence boost when you understand something really well I don't even need packages to implement this algo should there ever be a need to do so! Thank you! Comment from : JITHU NAIR |
|
Thank you so much! Comment from : /// |
|
3 minutes into the video, I got it Great video Keep up the good work Comment from : Henry Tirla |
|
Thank you for this video, helped me a lot! I couldn't find a video on your channel about fuzzy c-means, but do you have one? Maybe I just couldn't find it Comment from : Jente Meijer |
|
Hi Josh, I just love the way you explain things. I request you to upload a video on fuzzy c-means clustering as well Comment from : mona saeed |
|
Your videos are so good, we actually use them in AI Class in University haha We are doing the reversed Classroom method and the Prof just linked this video for K-Means and we discussed it later Of course with additional material, but usually he does his own short videos Comment from : Mekkes |
|
Your videos are so good! They are calm without being boring, in the exact rhythm to understand the concept without getting tired of it, and doing it so perfectly is an art. Congratulations StatQuest, you got another sub! Comment from : Portho Games BR |
|
What if the Euclidean distances between a point and 2 central points are equal? Comment from : kelvin maumba |
|
Hi Josh, do you have any video for Silhouette Method? So far all explanations I found are very poor Comment from : Pablo Paiva |
|
Nicely explained, "Question?" - Dwight much ? 😅 Comment from : sunil patra |
|
Can you do a video on Block Clustering? Would love to see you do one specifically on that Comment from : W |
|
You won me over with the intro Comment from : Storm Robinson |
|
Your video is sooooo clear!!!!!!!!! Thx! Comment from : Yibing Jiang |
|
What a beautiful Intro Josh <3 Comment from : Hannah Bergeron |
|
Thanks for the clear explanation. What if a point is the same distance from two or more centre points? Comment from : Chamara jayanath |
|
In the X/Y-axis case, is the mean the mean of the values in the cluster, or the centre point of the cluster on the 2D axes? Comment from : Abdul Sami |
|
I can understand this clearly, thanks for your video Comment from : Huy Trần |
|
Your videos are so clear and well explained, thank you so much for this content Comment from : Luis Talavera |
|
AWESOME SONG BROO StatQuest! Comment from : Better Call Haroon |
|
Have you worked as a data scientist? If so, is it as easy as this? I'm just finishing a masters Comment from : karrde666666 |
|
Such an amazing explanation! Enjoyed it a lot. Thanks for the video Comment from : Parampreet Singh |
|
BAM Comment from : Jesse Wild Pol |
|
I have watched so many of your videos They are sooooo helpful My understanding of ML and stats is at a new level now thank you for making them You are the best!! could you also make some videos about time series and causal inference methodologies Comment from : Ariel Jiang |
|
this one has my favorite statquest melody! Comment from : Ariel Jiang |
|
Dear Researcher, kindly guide: how can I cluster the questionnaire line items of a large data set with more than 1000 observations? I have 79 final line items of the questionnaire and now want to cluster them into distinct latent variables. Kindly guide me on how I can cluster the line items. Thanks in anticipation Comment from : muhammad qasim |
|
Thank you Nobody can explain this better! Just watched this one video and I hit 'Subscribe' Comment from : Muthu |
|
StatQuest!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Comment from : Some Call Me, Tim |
|
This is awesome Thanks! Comment from : Fritschge128 |
|
Hi Josh, I guess the representation of the elbow plot is incorrect. Since the reduction in variance (within each cluster) is inversely related to the number of clusters, the slope must be negative rather than positive 🧐 Comment from : Amir Ali |
|
Great explanation! thank you very much :) Comment from : Shamanth Raj Reddy |
|
Why does almost everyone go with k-means and not DBSCAN or hierarchical? Does one algorithm take precedence over the others? Comment from : Amir Ali |
|
Should we standardize features for k-means? Comment from : Nabiila _ |
|
key-nay-e-viritz'wors-parse-ebul-taysts!!'inno Comment from : steph clements |
|
Thanks! Comment from : Mehul Patel |
|
fantastic, by the way i like your tone, it is not boring Comment from : Mr J |
|
very informative Comment from : vibii |
|
Who is watching this 1 day before the final exams? Comment from : Gatsby Liu |
|
Do you have an idea about assumptions in the case of large variations? Can you explain it? Comment from : dataanalyst101 |
|
At the start of the video: Bam? After the end: ohhhhhhh BAM!!! Thanks to you, I am ready for my interview Comment from : Hariharan Venkatesan |
|
Thank you so much for your awesome videos! Could you please make a video for Gaussian Mixture Models too? Comment from : DEEPAK SV |
|
THANK YOU I LOVE YOU <3 Comment from : Stella Friaisse |
|
Do we have a video about Gaussian Mixture Models and EM on StatQuest, please Comment from : Daisy W |
|
Bam? 😅 Comment from : setiawan aji |
|
You make my research almost complete. Thanks for the useful and clear explanation <3 Comment from : Patipon W |
|
Hey teacher, I follow you from Algeria. I ask you to translate what you write into Arabic so that I can understand you, because I do not know English 😭😭 Comment from : Widad Kouache |
Three Clustering Algorithms You Should Know: k-means clustering, Spectral Clustering, and DBSCAN by : Dr. Data Science |
Shape Analysis (Lecture 20): Segmentation and clustering (k-means, Frechet means, normalized cuts) by : Justin Solomon |
StatQuest: Hierarchical Clustering by : StatQuest with Josh Starmer |
2. K-means Clustering: Types, Applications, and Distance Measures | Machine Learning Algorithms by : CodersArts |
Statistical Learning: 12.3 k means Clustering by : Stanford Online |
Clustering: K-means and Hierarchical by : Serrano.Academy |
StatQuest: PCA main ideas in only 5 minutes!!! by : StatQuest with Josh Starmer |
Learnable Similarity Functions and Their Applications in Information Integration and Clustering by : Microsoft Research |
6.2.7 An Introduction to Clustering - Video 4: Computing Distances by : MIT OpenCourseWare |
Algorithmic advances on metric and graph clustering (Part 1) Vincent Cohen-Addad (Google, Zurich) by : CSAChannel IISc |