@NetworkChuck

🛠 Build your own AI Hub!! Run OpenWebUI on your own VPS with Hostinger (code networkchuck10): https://hostinger.com/networkchuckvps 


AI is getting expensive… but it doesn’t have to be. I found a way to access all the major AI models – ChatGPT, Claude, Gemini, even Grok – without paying for multiple expensive subscriptions. Not only do I get unlimited access to the newest models, but I can also share it with my entire team, my wife, my kids – all while keeping full control over what they can access. Better security, more privacy, and a ton of features… this might be the best way to use AI.




RESOURCES:

🔧 Self-host OpenWebUI (On-Prem Installation Guide): https://youtu.be/JJ_0-pAOIEk
🌐 Turn OpenWebUI into a website (Domain + SSL Setup): https://youtu.be/BdH_yR_J3FQ


TIMESTAMPS:

0:00 - Intro
1:11 - The Plan (What is OpenWebUI?)
2:20 - The Cloud Option
5:02 - Install OpenWebUI
6:18 - Connecting ChatGPT API
9:06 - How Much Does This Cost?
13:06 - Using LiteLLM to do MORE








🔄🔄 Join the NetworkChuck Academy!: https://ntck.co/NCAcademy 



**Sponsored by Hostinger**

@TheMCFisk

Hey Chuck. I think the title is a little misleading here. You make it seem like this is about running AI at home and not paying for subscriptions, but what the video is actually about is running the web UI interface that you use to connect to external AI providers. It says cheaper + unlimited access, but realistically that's only true if you're using Ollama on good hardware; otherwise you're still paying for access to an external AI provider.

I'd really recommend calling it something like ACCESS ALL YOUR AI IN ONE PLACE {insert the name of the front end software}

@RonRonRonRonAway

Please add a follow-up with this interface, but with using only free AI models. And showcase the advantages of using free vs paid. Privacy being the biggest concern.

@santi-leoni

Great vid, thanks for the walkthrough. A tip for anyone who doesn't mind paying a little more for an easy setup: you can get an OpenRouter key, add it as an "OpenAI" connection in Open WebUI, and you're good to go with all the models they support.
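This works because OpenRouter exposes an OpenAI-compatible API, so any OpenAI client (including Open WebUI's "OpenAI" connection type) can talk to it by swapping the base URL. A minimal sketch of the request shape — the API key and model name are placeholder examples, and the exact Open WebUI settings path may differ by version:

```python
# Sketch: OpenRouter speaks the OpenAI chat-completions format, so a request
# is just the standard OpenAI JSON body sent to OpenRouter's base URL.
import json

OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1"  # OpenAI-compatible endpoint


def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for POST {OPENROUTER_BASE_URL}/chat/completions."""
    return {
        # OpenRouter uses "provider/model" names, e.g. "openai/gpt-4o"
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


# In Open WebUI you'd paste your OpenRouter key into an "OpenAI"-type
# connection and set the connection's URL to OPENROUTER_BASE_URL instead
# of calling it from code like this.
print(json.dumps(build_chat_request("openai/gpt-4o", "Hello!")))
```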

@user-tk7sc4gz2v

You started the video with self-hosting, then you added a bunch of subscriptions and cloud APIs. I'll use the services directly and skip the $2 savings and the complexity.

@RFLTools

Just a note: OpenAI credits expire after a year of non-use, so if you use it rarely I'd suggest keeping the refills to $5.

@ThomasMeli

The API is not cheaper if you use the frontier models as much as I do, but this is still cool and good to know.

@DIYDaveOK

I'm honestly confused. I initially thought this was self-hosted, but one of the first steps was to open an account with a cloud provider for a VM? Maybe I missed something? If I'm not running it locally, that's not really self-hosted.

@dagda.gaming

I absolutely love your videos! I recently set this up so my kids can have monitored access to AI to help with schoolwork. Did you revise your system prompt after this video was recorded? I want to make sure I put sufficient safeguards in place.

@gamerguy9533

100%!!! Cover more about Open WebUI. Such a powerful application.

@RAMIEID1

A follow-up video on how to update OpenWebUI without losing data would be great. All in all, excellent work.
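For anyone wondering in the meantime: with the standard Docker install, chat history lives in a named volume rather than in the container itself, so the usual pattern is to pull the newer image, replace the container, and remount the same volume. A sketch, assuming the container and volume names from the project's default `docker run` example:

```shell
# Assumes OpenWebUI was started with a named volume "open-webui" mounted
# at /app/backend/data (the default in Open WebUI's docker run example).
# Data survives because only the container is replaced, not the volume.
docker pull ghcr.io/open-webui/open-webui:main   # fetch the newer image
docker stop open-webui && docker rm open-webui   # remove the old container
docker run -d -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Verify your actual volume name first with `docker inspect open-webui` — if the data was mounted differently, adjust the `-v` flag to match.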

@AhmedMoussa147

This is the first time I've watched one of your videos that I didn't like.

1. Misleading title (clickbait)
2. No one wants to use a dumb AI, so all of us are using the smartest models, which means the monthly subscription is cheaper than the APIs (I understand the use of asterisks in the video, and your use case is very specific to your needs)
3. Loved the new video editing effect of having the UI visible and integrated into the video (I'm assuming Final Cut is being used here)
4. Please don't start making videos just for the sake of making a video (don't fall into the same trap as Linus from LTT)
5. The Hostinger hosting won't work for any locally run models, as the server is far too weak (hardware-wise) to run even mid-level models

I (like many others) enjoy your videos and have been here since the beginning, so I had to say something.

I don't mean to be disrespectful in any way. I just like your content and what you're doing so much that I felt the need to say something. I honestly hope you at least take what I said under advisement.

@christopherecrawford1406

Hey Chuck… been enjoying your content for about 2-3 years now. When you mentioned Tim "The Toolman" Taylor, it made me laugh, because I agree and also started watching from episode 1. Keep up the great work, Chuck! …More Power!!!!!

@arshamskrenes

Unlike most here, I'm very grateful for this video. I wanted to add additional providers to Open WebUI (which I started using because of you) and found this extremely helpful. The only thing I wish you'd added is how to set budgets PER USER or per group, rather than per provider. I'd like to share accounts with extended family, but keep them on a lower budget and rate limit than my own. Thanks for the great video!
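LiteLLM's proxy can do roughly this: it mints "virtual keys", each with its own spend cap, which you could hand out one per family member. A hypothetical sketch against the `/key/generate` endpoint — the port, master key variable, user ID, and exact field names are assumptions to double-check against your installed LiteLLM version:

```shell
# Hypothetical: mint a virtual key capped at $10 per 30 days for one user.
# $LITELLM_MASTER_KEY, the port, and "kid-account" are placeholders.
curl -X POST http://localhost:4000/key/generate \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{"user_id": "kid-account", "max_budget": 10.0, "budget_duration": "30d"}'
```

The returned key would then go into that person's connection in Open WebUI, so the cap applies to them rather than to the whole provider.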

@williamedds1

Another video on Open-WebUI.... PLEASE!! This is very awesome stuff and you and your team are awesome at explaining things. Thank you so much!

@SmollVoice

You are an absolute legend. For years, back-end development scared me away, and I became a creative designer because of it. I have a very logic- and math-oriented thought process, however. So being able to learn from you and actually have fun delving into all of this is something I can't express enough gratitude for. 
Thank you for being you.

@ThomasEmminger

Great video! However, I think the title is a bit misleading. At first, I thought this video would be about hosting an LLM yourself, without the need for subscriptions.

@joeltb

Yes, please do another video that expands upon this stuff. Super interesting and your videos are always engaging so I am learning....

@danielcamposramos9943

Oh, for sure I want a complete series breaking all of this down!

@matthewbond375

LiteLLM is the secret sauce I was missing on my OpenWebUI instance. Now I can compare local model results to bigger models like Grok. Thanks Chuck!
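The setup this comment describes boils down to one LiteLLM proxy config listing local and cloud models side by side, which Open WebUI then sees as a single provider. A sketch — the model names, Ollama URL, and environment variable are illustrative assumptions, not the video's exact config:

```yaml
# Sketch of a LiteLLM proxy config.yaml exposing a local Ollama model and a
# cloud model together, so Open WebUI can compare them from one connection.
model_list:
  - model_name: local-llama          # name Open WebUI will display
    litellm_params:
      model: ollama/llama3           # LiteLLM's "ollama/" provider prefix
      api_base: http://localhost:11434
  - model_name: grok
    litellm_params:
      model: xai/grok-2              # LiteLLM's "xai/" provider prefix
      api_key: os.environ/XAI_API_KEY  # read the key from the environment
```

You'd start the proxy with this file and point Open WebUI at the proxy's OpenAI-compatible endpoint; both models then show up in the same model picker.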