Wilshire@lemmy.world to Technology@lemmy.world · English · 4 months ago
The first GPT-4-class AI model anyone can download has arrived: Llama 405B (arstechnica.com)
abcdqfr@lemmy.world · 4 months ago
Intriguing. Is that an 8 GB card? Might have to try this after all.
RandomLegend [He/Him]@lemmy.dbzer0.com · 4 months ago
Yup, 8 GB card. It's my old one from the gaming PC after switching to AMD. It now serves as my little AI hub and Whisper server for Home Assistant.
abcdqfr@lemmy.world · 4 months ago
What the heck is Whisper? I've been fooling around with HASS for ages and haven't heard of it, even after at least two minutes of searching. Is it OpenAI-affiliated hardware?
RandomLegend [He/Him]@lemmy.dbzer0.com · 4 months ago
Whisper is a speech-to-text (STT) application that stems from OpenAI afaik, but it's open source at this point. I wrote a little guide on how to install it on a server with an NVIDIA GPU and hardware acceleration, and then integrate it into your Home Assistant: https://a.lemmy.dbzer0.com/lemmy.dbzer0.com/comment/5330316
It's super fast with a GPU available, and I use those little M5 ATOM Echo microphones for this.
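For anyone wondering what Whisper actually does before following the guide, here's a minimal sketch using the openai-whisper Python package. This is just an illustration of GPU-accelerated transcription, not necessarily how the linked guide wires it up; the model size, device, and the audio file name are placeholders.

```python
import whisper  # pip install openai-whisper

# Load a model; "base" fits comfortably on an 8 GB card.
# device="cuda" runs it on the NVIDIA GPU if PyTorch can see one.
model = whisper.load_model("base", device="cuda")

# Transcribe a short recording, e.g. a voice command captured by a
# satellite microphone such as the M5 ATOM Echo (placeholder file name).
result = model.transcribe("voice_command.wav")
print(result["text"])
```

In a real Home Assistant setup the Whisper add-on/integration handles this step for you; the snippet only shows the transcription part that benefits from the GPU.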