Nvidia Chat With RTX AI Chatbot Runs Without Internet on Windows PCs: Minimum Requirements and All Details

Nvidia released the tool on Tuesday, February 13, describing it as a personal AI chatbot. Users wishing to download the software will need a Windows PC or workstation equipped with an RTX 30- or 40-series GPU with a minimum of 8GB of VRAM. Once downloaded, the app can be installed with a few clicks and used immediately.
Because it is a local chatbot, Chat With RTX has no knowledge of the outside world. However, users can feed it their personal data, such as documents and files, and have it run queries on them. One use case might be to point it at a large set of work-related documents and ask it to summarise, analyse, or answer a specific question whose answer could otherwise take hours to locate manually.
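To illustrate the idea behind querying local documents, here is a minimal, self-contained sketch of the retrieval step such a chatbot performs. This is a toy keyword-overlap ranker written for this article, not Nvidia's actual implementation, which pairs retrieval with a full language model.

```python
def chunk(text, size=50):
    """Split a document into chunks of roughly `size` words each."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def retrieve(query, documents, top_k=1):
    """Return the top_k chunks sharing the most keywords with the query."""
    q_terms = set(query.lower().split())
    chunks = [c for doc in documents for c in chunk(doc)]
    scored = sorted(
        chunks,
        key=lambda c: len(q_terms & set(c.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

docs = [
    "The quarterly report covers revenue growth across all regions.",
    "My partner recommended a restaurant in Aspen last winter.",
]
print(retrieve("restaurant my partner recommended", docs)[0])
```

A real pipeline would use embedding-based similarity instead of keyword overlap and would then pass the retrieved chunks to the LLM to compose an answer.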
It supports text (.txt), PDF, DOC/DOCX, and XML file formats. The AI bot also accepts YouTube video and playlist URLs, and it can use a video's transcript to answer questions about it or summarise it. This functionality requires Internet access.
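The transcript-summarisation idea can be sketched with a toy extractive summariser: score each sentence by how frequent its words are across the whole transcript, then keep the highest-scoring sentences. This stand-in is written for illustration only; Chat with RTX performs summarisation with its LLM, not with this heuristic.

```python
import re
from collections import Counter

def summarize(transcript, max_sentences=2):
    """Pick the sentences whose words are most frequent in the transcript."""
    sentences = re.split(r"(?<=[.!?])\s+", transcript.strip())
    freq = Counter(re.findall(r"[a-z']+", transcript.lower()))

    def score(sentence):
        terms = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in terms) / (len(terms) or 1)

    top = sorted(sentences, key=score, reverse=True)[:max_sentences]
    # Emit the chosen sentences in their original order.
    return " ".join(s for s in sentences if s in top)

clip = "GPUs accelerate AI workloads. GPUs power the chatbot. The weather was nice."
print(summarize(clip, max_sentences=1))
```

An LLM-based summariser would rewrite content in its own words rather than extract sentences verbatim, but the input (a transcript) and output (a shorter digest) are the same shape.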
According to the demo video, Chat with RTX runs as a local web server with a Python instance and does not include a large language model (LLM) when freshly downloaded. Users can choose between the Mistral and Llama 2 models and then point the chatbot at their own data to run queries.
The company says the chatbot's functionality builds on retrieval-augmented generation (RAG), the open-source TensorRT-LLM library, and RTX acceleration.