Views : 25,236
Genre: Education
Date of upload: Apr 19, 2024
Rating : 4.762 (29/459 LTDR)
RYD date created : 2024-05-16T06:07:22.975158Z
Top Comments of this video!! :3
I have Ollama on my computer and I am currently using it to run AI models through Python. I need to process complex instructions that I can only run with the 70B model; the problem is that, due to its complexity, it takes a long time to execute (around 2 minutes). How can I reduce that time? Currently the model runs on the CPU. How can I configure Ollama to use the GPU?
|
@sanadasaradha8638
3 weeks ago
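One common approach to the question above (a sketch, not an official answer): Ollama decides how many model layers to offload to the GPU, and recent versions expose this via the `num_gpu` option in a request's `options` dict; the function below just builds that dict, and the commented-out usage assumes the `ollama` Python client and a running local server. A 70B model may still not fit entirely in VRAM, in which case only some layers are offloaded.

```python
# Sketch: request GPU offload from Ollama via per-request options.
# Only the option-building logic is run here; the actual chat call
# (commented out) assumes the `ollama` package and a local server.

def gpu_options(num_gpu_layers=999, num_thread=None):
    """Build an Ollama `options` dict that requests GPU offload.

    num_gpu is the number of model layers to place on the GPU; a large
    value (e.g. 999) effectively means "as many as fit in VRAM".
    num_thread tunes CPU threads for any layers left on the CPU.
    """
    opts = {"num_gpu": num_gpu_layers}
    if num_thread is not None:
        opts["num_thread"] = num_thread
    return opts

# Usage (requires `pip install ollama` and a running Ollama server):
# import ollama
# resp = ollama.chat(
#     model="llama3:70b",  # hypothetical model name for illustration
#     messages=[{"role": "user", "content": "hello"}],
#     options=gpu_options(),
# )
```

You can also verify where a loaded model is running with `ollama ps`, which reports whether the model is on the CPU, the GPU, or split between them.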
Instead of showcasing all the new models, it would be better to implement a single open-source LLM for all use cases, including fine-tuning. At the same time, it would be better to build an end-to-end project with an open-source LLM.
20 |