mesamune@lemmy.world to Technology@lemmy.world · English · 11 months ago
The AI feedback loop: Researchers warn of ‘model collapse’ as AI trains on AI-generated content (venturebeat.com)
cross-posted to: futurology@lemmy.world, technews@radiation.party, artificial_intel@lemmy.ml, technology@lemmy.ml, technology@beehaw.org
danielbln@lemmy.world · English · 11 months ago
Microsoft’s Phi model was largely trained on synthetic data derived from GPT-4.
gapbetweenus@feddit.de · English · edited · 11 months ago
I’m too lazy to search for the paper, and I’m not sure it was Microsoft, but with my rather basic knowledge of modeling (I studied systems biology) it seemed rather crazy and impossible, so I remembered it.