@[email protected] to TechnologyEnglish • 12 hours agoAI chatbots unable to accurately summarise news, BBC findswww.bbc.comexternal-linkmessage-square40fedilinkarrow-up1229arrow-down14cross-posted to: [email protected]
arrow-up1225arrow-down1external-linkAI chatbots unable to accurately summarise news, BBC findswww.bbc.com@[email protected] to TechnologyEnglish • 12 hours agomessage-square40fedilinkcross-posted to: [email protected]
minus-square@[email protected]linkfedilinkEnglish10•edit-211 hours agoThey are, however, able to inaccurately summarize it in GLaDOS’s voice, which is a strong point in their favor.
@JackGreenEarth • 11 hours ago
Surely you’d need TTS for that one, too? Which one do you use? Is it open weights?
@brucethemoose • 11 hours ago (edited)
Zonos just came out, seems sick: https://huggingface.co/Zyphra
There are also some “native” TTS LLMs like GLM 9B, which “capture” more information in the output than pure text input.
@ag10n • 9 hours ago (edited)
A website with zero information, and barely anything on their Hugging Face page. What’s exciting about this?
Ah, you should link to the model: https://www.zyphra.com/post/beta-release-of-zonos-v0-1
@brucethemoose • 7 hours ago
Whoops, yeah, I should have linked the blog. I didn’t want to link the individual models because I’m not sure whether the hybrid or the pure-transformer version is better.
Looks pretty interesting, thanks for sharing it.