Hey Chuck. I think the title is a little misleading here. You make it seem like this is about running AI at home and not paying for subscriptions, but what the video is actually about is running a web UI frontend that you use to connect to external AI providers. It says cheaper + unlimited access, but realistically that's only true if you're using Ollama on good hardware; otherwise you're still paying for access to an external AI provider. I'd really recommend calling it something like ACCESS ALL YOUR AI IN ONE PLACE {insert the name of the frontend software}
Please add a follow-up with this interface, but using only free AI models. And showcase the advantages of free vs paid, privacy being the biggest concern.
Great vid, thanks for the walkthrough. A tip for everyone who doesn't mind paying a little more for an easy setup: you can get an OpenRouter key, add it as an "OpenAI" connection in Open WebUI, and you're good to go with all the models they support.
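For anyone wondering why that works: OpenRouter exposes an OpenAI-compatible API, so the same client code (and Open WebUI's "OpenAI" connection type) only needs a different base URL and key. Here's a rough Python sketch of the equivalent call; the model name is just an example, swap in anything OpenRouter lists.

```python
# Minimal sketch: point the standard OpenAI client at OpenRouter's
# OpenAI-compatible endpoint. Model name below is only an example.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
    api_key="sk-or-...",                      # your OpenRouter API key
)

response = client.chat.completions.create(
    model="anthropic/claude-3.5-sonnet",  # provider/model format used by OpenRouter
    messages=[{"role": "user", "content": "Say hello from OpenRouter."}],
)
print(response.choices[0].message.content)
```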
You started the video with self-hosting, then added a bunch of subscriptions and cloud APIs. I'll use the services directly and skip the $2 of savings and the complexity.
Just a note: OpenAI credits expire after a year of non-use. So if you use it rarely I'd suggest keeping the refills to $5.
The API is not cheaper if you use the frontier models as much as I do, but this is still cool and good to know.
I'm honestly confused. I initially thought this was self-hosted, but one of the first steps was to open an account with a cloud provider for a VM? Maybe I missed something? If I'm not running it locally, that's not really self-hosted.
I absolutely love your videos! I recently set this up so my kids can have monitored access to AI to help with schoolwork. Did you revise your system prompt after this video was recorded? I want to make sure I put sufficient safeguards in place.
100%!!! Cover more about Open WebUI. Such a powerful application.
A follow-up on how to update Open WebUI without losing your data would make a great video. All in all, excellent work.
This is the first time I've watched one of your videos that I didn't like.
1. Misleading title (clickbait).
2. No one wants to use a dumb AI, so most of us are using the smartest models, which means the monthly subscription ends up cheaper than the APIs (I understand the asterisks in the video, and your use case is very specific to your needs).
3. Loved the new editing effect of having the UI visible and integrated into the video (I'm assuming Final Cut is being used here).
4. Please don't start doing videos just for the sake of making a video (don't fall into the same mistake as Linus from LTT).
5. The Hostinger hosting won't work for any locally run models, since the server is far too weak (hardware-wise) to run even mid-level models.
I (like many others) enjoy your videos and have been here since the beginning, so I had to say something. I don't mean to be disrespectful in any way. I just like your content and what you're doing so much that I felt the need to speak up. I honestly hope you at least take what I said under advisement.
Hey Chuck… been enjoying your content for about 2-3 yrs now. When you mentioned Tim "The Toolman" Taylor it made me laugh, because I agree and also started watching from Episode 1. Keep up the great work Chuck! …More Power!!!!!
Unlike most here, I'm very grateful for this video, as I wanted to add additional providers to Open WebUI (which I started using because of you) and found this extremely helpful. The only thing I wish you had added is how to set budgets PER USER or per group, rather than per provider. I'd like to share accounts with extended family, but keep them on a tighter budget and rate limit than myself. Thanks for the great video!
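One way to approximate per-person budgets is at the LiteLLM proxy layer rather than in Open WebUI itself: issue each family member their own virtual key with a spend cap. This is only a rough sketch based on LiteLLM's key-management API as I understand it; the URL, master key, and field names are assumptions, so double-check the current LiteLLM docs.

```python
# Rough sketch (not from the video): ask the LiteLLM proxy to mint a
# per-person virtual key with its own budget. Endpoint and field names
# are from LiteLLM's key-management API as I understand it -- verify
# against the current docs before relying on this.
import requests

LITELLM_URL = "http://localhost:4000"   # wherever your LiteLLM proxy runs (assumption)
MASTER_KEY = "sk-1234"                  # the proxy's master key (assumption)

resp = requests.post(
    f"{LITELLM_URL}/key/generate",
    headers={"Authorization": f"Bearer {MASTER_KEY}"},
    json={
        "user_id": "family-member-1",  # who this key belongs to
        "max_budget": 5.0,             # dollars this key may spend
        "budget_duration": "30d",      # budget window, e.g. resets monthly
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["key"])  # give this key to that person's connection
```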
Another video on Open-WebUI.... PLEASE!! This is very awesome stuff and you and your team are awesome at explaining things. Thank you so much!
You are an absolute legend. For years, back-end development scared me away, and I became a creative designer because of it. I have a very logic- and math-oriented thought process, however. So being able to learn from you and actually have fun digging into all of this is something I can't express my gratitude for enough. Thank you for being you.
Great video! However, I think the title is a bit misleading. At first, I thought this video would be about hosting an LLM yourself, without the need for subscriptions.
Yes, please do another video that expands upon this stuff. Super interesting and your videos are always engaging so I am learning....
Oh for sure I want a complete series breaking down all things!
LiteLLM is the secret sauce I was missing on my OpenWebUI instance. Now I can compare local model results to bigger models on Grok. Thanks Chuck!
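For anyone curious what that comparison looks like outside the UI, here's a rough sketch using LiteLLM's Python SDK directly: same prompt, same call signature, one local model and one hosted model. The model names and the Ollama endpoint are just examples of a typical setup, not what the video uses.

```python
# Small sketch: send the same prompt to a local Ollama model and a hosted
# model through LiteLLM's unified completion() call. Model names and the
# Ollama port are examples -- swap in whatever you actually run.
import litellm

PROMPT = [{"role": "user", "content": "Explain DNS in two sentences."}]

local = litellm.completion(
    model="ollama/llama3",              # local model served by Ollama
    api_base="http://localhost:11434",  # default Ollama endpoint
    messages=PROMPT,
)

hosted = litellm.completion(
    model="gpt-4o",                     # hosted model; needs OPENAI_API_KEY set
    messages=PROMPT,
)

print("LOCAL :", local.choices[0].message.content)
print("HOSTED:", hosted.choices[0].message.content)
```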