A notebook to help you and me remember some tricks

Category: AI

VSCode + free LLM model on a private server, with or without a GPU (CPU-only)

Prerequisites: a private server with 8 to 12 cores (8 to 12 virtual CPUs if your server is a virtual machine). Install Ollama + LLM models: to easily manage LLM models, you can install Ollama. It's easier to manage… Continue Reading →
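As a minimal sketch of the setup the teaser describes, the commands below install Ollama with its official install script, pull a model, and expose the server on the network so a remote VSCode instance can reach it. The model name (`llama3.2`) is an assumption for illustration; pick any model that fits your server's RAM and CPU.

```shell
# Install Ollama (official install script from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model — llama3.2 is an example; smaller models run better on CPU-only servers
ollama pull llama3.2

# Bind the Ollama server to all interfaces so VSCode on another machine can connect
# (default is 127.0.0.1:11434, which is local-only)
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```

With the server running, a VSCode extension that speaks the Ollama API can be pointed at `http://<server-ip>:11434`. On a CPU-only machine, quantized models and a core count in the 8-to-12 range mentioned above keep response times tolerable.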

© 2025 Software engineer >> Sysadmin >> Devops >> SRE — Powered by WordPress
