• NVIDIA released a demo version of a chatbot that runs locally on your PC, giving it access to your files and documents.

• The chatbot, called Chat with RTX, can answer queries and create summaries based on personal data fed into it.

• It supports various file formats and can integrate YouTube videos for contextual queries, making it useful for data research and analysis.

    • Dojan
      9 months ago

      There were CUDA cores before RTX. I can run LLMs on my CPU just fine.

    • halfwaythere
      9 months ago

      This statement is so wrong. I have Ollama running the llama2 model decently on a GTX 970. Is it super fast? No. Is it usable? Yes, absolutely.