Video id : tDRVUrcuqqo
248,028 Views • Jul 14, 2023
Join the MDJ community: youtube.com/channel/UCrPhcbDwqWRc-3tteE2BS6g/join
Watch More! Chat GPT Playing Doctor: • Chat GPT's Medical Advice is SHOCKING!

FOLLOW ME ON SOCIAL:
Instagram: www.instagram.com/mamadoctorjones
Twitter: www.twitter.com/mamadoctorjones
TikTok: www.tiktok.com/@mamadoctorjones

** The information in this video is intended to serve as educational information and is not intended or implied to be a substitute for professional medical advice, diagnosis or treatment. All content, including text, graphics, images, and information, contained in this video is for general information purposes only and does not replace a consultation with your own doctor/advanced practice provider. **
Metadata And Engagement

Views : 248,028
Genre: Education
License: Standard YouTube License
Uploaded At Jul 14, 2023 ^^


warning: returnyoutubedislikes may not be accurate, this is just an estimate ehe :3
Rating : 4.828 (889/19,809 LTDR)

95.70% of the users liked the video!!
4.30% of the users disliked the video!!
User score: 93.55 (Overwhelmingly Positive)

RYD date created : 2024-03-21T16:03:42.189837Z

497 Comments

Top Comments of this video!! :3

@Kaldylicious

1 year ago

My (computer engineer) dad says this all the time "garbage in, garbage out". Sometimes I think the scariest thing about AI is that it reflects the worst of ourselves.

2.8K |

@LoFiAxolotl

1 year ago

I used to work at a camera manufacturer... and we found this interesting thing... the autofocus on cameras is incredible at identifying white & Asian men and women... we couldn't figure out why it was struggling with BIPOC people... until we looked at the set of data we fed the algorithm... less than 2% were BIPOC people... AI is incredibly susceptible to bias...

1.7K |

@kikicogger2284

1 year ago

The problem is exacerbated by the fact that many female doctors aren't addressed by their earned titles in media about them. If "Doctor" or "Dr." isn't explicitly used in an article/video/etc., said media will not be included in the results ChatGPT reports out, creating bias. The reason Dr. Mike was so high is because he's always referred to as DR. Mike, not general practitioner or any other title. MamaDoctorJones, on the other hand, has shown several examples where her title isn't used at all, e.g. Obstetrician Jones instead of Dr. Jones. This discrepancy happens way too often and needs to be called out more.

186 |

@ildonoa3928

1 year ago

Recently graduated software engineer here, and this is an active discussion on the research end of the computer science spectrum. One avenue of research is using procedurally generated data sets instead of web scraping to train AI instances. Even this STILL reflects bias. It is important to continue to bring this issue to the surface to drive grant dollars to fund research.

277 |

@Shadow1Yaz

1 year ago

This is absolutely correct. An AI being trained to detect skin cancer was accidentally taught that rulers are malignant. Most pictures of skin cancer have rulers or other measuring equipment in them, so pictures of perfectly healthy skin with a ruler in the picture would be flagged as cancerous.

800 |
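The failure mode this comment describes (a spurious feature that correlates with the label better than the real medical signal does) can be sketched in a few lines of Python; the tiny dataset below is invented purely for illustration:

```python
# Hypothetical toy dataset: each sample is a dict of image features,
# label 1 = malignant. In this (flawed) training set, malignant photos
# almost always include a ruler.
train = [
    ({"has_ruler": 1, "irregular_border": 1}, 1),
    ({"has_ruler": 1, "irregular_border": 1}, 1),
    ({"has_ruler": 1, "irregular_border": 0}, 1),
    ({"has_ruler": 0, "irregular_border": 0}, 0),
    ({"has_ruler": 0, "irregular_border": 1}, 0),
    ({"has_ruler": 0, "irregular_border": 0}, 0),
]

def accuracy_of_feature(feature):
    """Accuracy of predicting 'malignant' whenever the feature is present."""
    return sum((x[feature] == y) for x, y in train) / len(train)

# The ruler is a *better* predictor than the actual medical signal,
# so a naive learner will latch onto it.
print(accuracy_of_feature("has_ruler"))         # 1.0
print(accuracy_of_feature("irregular_border"))  # ~0.67
```

A real model sees only pixels, not named features, but the incentive is the same: whatever separates the classes most cleanly in the training set gets learned, medically meaningful or not.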

@philospher77

1 year ago

An ad that I am getting a lot right now is a finance company asking AI to draw a picture of "someone good with money" (tweaked a couple of times to try and change the output). They say that less than 2% of the images were of women, even though (according to this company) women are better at finances and investing than men are. What you get out of AI depends highly on what data is used to train it.

266 |

@spinasoul

1 year ago

I remember someone made a thread on Twitter a while back about messing with it by mixing up doctor and nurse. For example, when given "a nurse goes to a doctor, she was busy, who was busy?" it would answer the nurse 100% of the time; however, when given "he was busy" it would answer the doctor.

6 |

@JoanieBC

1 year ago

Focusing solely on the input from men has always been problematic, but it's even worse now with social media because men have no qualms about claiming expertise and speaking for others. Doesn't matter if they're right or wrong, their voices are just one part of the larger conversation. Women should always figure into the equation unless a journalist is asking for first-hand experience about prostate surgery and recovery or vasectomies. Even then, they should probably ask the women in the men's lives for input. Most of the he-man types (read: those who claim to be) were very likely begging for their partners to bring them fresh ice packs and maybe something special to eat. (I assisted with vasectomies and the reports from the men and their partners about recovery were VERY different.)

TLDR: do better, media! Women are part of this world, too.

101 |

@violet7773

1 year ago

I read that google translate (which has been using machine learning for the past number of years) had this problem translating from non-gendered languages (like Turkish) into English. Turkish doesn't have gendered pronouns (he/she are both "o"). So if you translated "o bir doktor" from Turkish it would always give you "he is a doctor" in English, and "o bir hemşire" would become "she is a nurse".
I just tried it now and they both translated into "she is X". But I think they only looked into it after people started writing news articles about it, so it sucks that it took a public outcry for them to even notice that their machine learning model had a bias.

I have a degree in computer science and it is such a huge problem. People say that computers can't be bigoted, and yes, the computers themselves don't feel hatred/disgust towards oppressed groups, but when a model is trained on biased data, biased data comes out the other end. And a lack of diversity in the field of software engineering only exacerbates this issue, because people are generally bad at noticing their own blind spots.

10 |
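The corpus-skew mechanism the comments above describe can be sketched with a toy example; the corpus and helper below are hypothetical illustrations, not Google Translate's actual model:

```python
from collections import Counter

# Hypothetical toy corpus, skewed the way a web scrape can be:
# "doctor" co-occurs mostly with "he", "nurse" mostly with "she".
corpus = [
    "the doctor said he", "the doctor said he", "the doctor said he",
    "the doctor said she",
    "the nurse said she", "the nurse said she", "the nurse said she",
    "the nurse said he",
]

def most_likely_pronoun(role):
    """Pick the pronoun that most often follows the role in the corpus."""
    counts = Counter(s.split()[-1] for s in corpus if role in s)
    return counts.most_common(1)[0][0]

# A model that simply maximizes corpus likelihood reproduces the skew,
# so a genderless pronoun gets "resolved" along stereotyped lines.
print(most_likely_pronoun("doctor"))  # he
print(most_likely_pronoun("nurse"))   # she
```

Nothing in this code "hates" anyone; picking the statistically most likely continuation is exactly what reproduces the bias in the data.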

@dcflash8209

1 year ago

Or (and I am a woman) computers don't care about being nice, nor do they care about feelings. You asked it a question and it answered with a fact. If you wanted specific parameters met then you should have inputted them. It probably looked at a lot of things such as views, likes, subscriptions, etc. You can't really be mad if a certain criteria isn't met by a certain party 🤷‍♀️

5 |

@alicecold

1 year ago

Beware of correcting ChatGPT though! That often leads to the AI listing people that it made up on the spot to satisfy your request.

48 |

@charlottesghost2845

1 year ago

yep - garbage in, garbage out. I can't believe that phrase isn't on repeat these days.

1 |

@gdehoyos006

1 year ago

She asked a question that implied ranking; it's not the AI's fault the boys are winning the competition ;)

1 |

@myeramimclerie7869

1 year ago

I just asked it the same question. It gave me 4 men, also Dr. Mike on top, and 1 woman (Dr. Pimple Popper). I asked it why you're not on the list; it apologized for the oversight, agreed that you are in fact a prominent doctor on youtube and thanked me for bringing you to its attention. lol 😄

28 |

@memefestus

1 year ago

I disagree with the assertion presented here. The reason being that for anything you ask ChatGPT, the information cutoff is Sept 2021 (almost 2 years ago), and just saying "top" youtube doctor is very vague because it could mean total views, total subscribers, etc., so this is much more nuanced.

7 |

@FixerFour

1 year ago

It's not just men, AI in general shows massive favoritism towards white cishet Christian men. One headhunting algorithm was given a list of "good resumes" to parse through and one of the conclusions it came to was "people named Chad are good to hire!"

327 |

@kiriki4558

1 year ago

I remember searching for female-led channels about videogames (because they are far less likely to host hostile communities) and there were not even mentions of them. You had to be lucky and dig deep to find them.
If the AI extracts from an already biased pool then the results will be biased unless you ask for specific characteristics.
The top 5 most viewed will be men most of the time.

16 |

@pattyolson3842

1 year ago

Computers don't know anything about us. Dr. Mike is good. But you are definitely the best female doctor on YouTube. And maybe even better than Dr. Mike for your advocacy!

1 |

@ignisbad9158

1 year ago

Did the 5 doctors it gave you have higher metrics than the ones it did not? Because that is relevant to whether or not it answered the question correctly.

24 |

@alix7657

1 year ago

AI bias is an actual thing that any competent machine learning programmer actively tries to mitigate or minimize as best they can. Your message in itself is right, and being aware of AI's potential biases and having a conversation about them is a good thing to do. However, I'm not sure you picked the best example to demonstrate your point. Asking for the "top" YouTubers can easily be read by the AI as listing the most subscribed or most well-known YouTubers, and with ChatGPT's data set being capped in 2021, it could be that the reason it spit out 5 men isn't that it is inherently biased against women YouTubers but that the question asked for the "top" and the top 5 happened to be men. So unless we analyze the numbers of all doctor YouTubers in 2021, we cannot say this was due to bias. More specific, non-quantitative questions would be better suited to assess bias.

87 |
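This comment's point, that a "top N" query simply sorts whatever metric the skewed input data supplies, can be sketched as follows; every channel name other than the two discussed in the thread, and every subscriber count, is invented purely for illustration:

```python
# Hypothetical channels and subscriber counts (all numbers made up).
channels = {
    "Doctor Mike": 10_000_000,
    "Dr. B (male)": 5_000_000,
    "Dr. C (male)": 4_000_000,
    "Dr. D (male)": 3_000_000,
    "Dr. E (male)": 2_500_000,
    "Mama Doctor Jones": 1_200_000,
}

def top_n(metric, n=5):
    """Rank channels purely by the given metric; no notion of fairness."""
    return sorted(metric, key=metric.get, reverse=True)[:n]

# If the top 5 by raw subscribers happen to be men, a plain ranking
# query reproduces that, whether or not the model itself is biased.
print(top_n(channels))
```

Distinguishing "the model is biased" from "the metric being ranked is skewed" requires exactly the kind of analysis of the underlying numbers the commenter calls for.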
