Using Cloudflare Tunnel to Expose Ollama on the Internet


Daily short news for you
  • For a long time, I have been thinking about how to raise the blog's brand presence and grow its readership. After much contemplation, it seemed the only ways were to share on social media or hope people seek it out themselves, until...

    Wearing this shirt means no more worries about traffic jams, the more crowded it gets, the more fun it is because hundreds of eyes are watching 🤓

    (It really works, you know 🤭)

    » Read more
  • The development cycle of many projects is quite interesting. It can be summarized in 3 steps: see something complex -> simplify it -> add features until it becomes complex again... -> back to a new loop.

    Why is that? Let me give you 2 examples to illustrate.

    Markdown was created with the aim of producing a plain text format that is "easy to write, easy to read, and easy to convert into something like HTML." At that time, no one had the patience to sit and write while also adding formatting for how the text displayed on the web. Yet now, people are "stuffing" or creating variations based on markdown to add so many new formats that… they can’t even remember all the syntax.

    React is another example. Since the days of PHP, there has been a desire to cleanly separate the user interface from the application's core logic, for better readability and maintainability. The result is that UI/UX libraries developed very robustly, providing excellent user interaction, while the application logic lives on a separate server. The Front-end/Back-end duo emerged from this, with the REST API as the indispensable go-between. Yet now, React doesn't look much different from PHP, and Vue, Svelte... are all converging back to the same point.

    However, the loop is not bad; on the contrary, this loop is more about evolution than "regression." Sometimes, it creates something good from something old, and people rely on that goodness to continue the loop. In other words, it’s about distilling the essence little by little 😁

    » Read more
  • Alongside the official projects, I occasionally see "side" projects aimed at optimizing or improving the language in some aspects. For example, nature-lang/nature is a project focused on enhancing Go, introducing some changes to make using Go more user-friendly.

    Looking back, it resembles JavaScript quite a bit 😆

    » Read more

Problem

Hello readers of 2coffee.dev. Tet is just around the corner; have you prepared anything for yourself and your family yet? It seems to me that as the year ends, everyone gets busier. Since the beginning of the month, traffic to the blog has dropped significantly. Sometimes it makes me anxious, because I don't know where my readers have gone. Maybe they are taking an early Tet break, or chatbots have grown too strong, or perhaps the content just isn't engaging enough anymore. 😥

I must admit that in these last few weeks I have been in the mindset of a busy person, without much time to write regularly. It could be the nature of the job, combined with many issues to handle, so I no longer have the headspace to relax. But it's okay; today I successfully configured Cloudflare Tunnel together with Ollama to publish an API endpoint on the Internet, something I couldn't do a few weeks ago. I figured many people would need this, so I decided to write an article about it right away.

At first, I intended to write a short post in the Threads section, but then I realized it had been too long since I wrote a lengthy article, so I changed my mind. Can you believe it? A long article can be condensed into just a few short lines. Conversely, a short article can easily be made "flowery" enough to turn into a lengthy piece that many might dread. So why should one strive to write longer?

Wow! If I didn't say it, no one might know the reason. Writing is a way for me to relieve stress. By writing, I can connect with my readers, share, chat, or weave in stories and lessons I have learned. In other words, writing serves both as a form of relaxation and a means to interact with everyone.

Since launching the short article section Threads, I never expected so many people would be interested in it. Oh, but to say I didn't expect would be an exaggeration because I did a lot of research before implementing this feature. "Coding" a feature isn't hard; the challenge lies in how to operate it. Threads must ensure that the frequency of posts isn't interrupted; if I write an article infrequently, would anyone even come back to check for updates? This inadvertently creates pressure on how to both gather and summarize interesting and prominent news for readers. Many days I got too busy and forgot to write, and sure enough, the next day I had to publish a make-up post to keep my credibility intact. 😆

I know that many people enjoy reading, and I am one of those who loves writing. Sometimes reading isn't always in the mindset of being "chased by a deadline," on the way to find a solution, or learning something new... I believe that for many people, reading is similar to writing: it is for relaxation. Relaxing while gaining knowledge and experience is indeed a two-for-one deal, isn't it? 😁

I've talked too much already; let's get to the main point. Today I successfully configured Cloudflare Tunnel together with Ollama to expose an API endpoint on the Internet. From there, anyone can access it, no longer confined to the local machine (localhost). After reviewing Ollama's documentation, it turned out to be simpler than I thought!

Cloudflare Tunnel & Ollama

If you don't know about Cloudflare Tunnel, refer back to the article Adding a "Tunnel Locally" Tool - Bringing Local Servers to the Internet. This is a tool that maps local servers to the Internet, effectively turning your computer into a server that anyone can reach through a public domain name or address.

Ollama is a tool that lets us run large language models (LLMs) on our own computers with just a single command. It simplifies installing and using models. A standout feature is that it exposes an OpenAI-compatible API.

In a previous article, I described creating a Tunnel through a six-step process, which is a bit lengthy. In fact, Cloudflare Tunnel has a much quicker way to start: install cloudflared and run a single command:

$ cloudflared tunnel --url http://localhost:11434  
...  
Your quick Tunnel has been created! Visit it at (it may take some time to be reachable):  
https://tennis-coordination-korea-wv.trycloudflare.com  
....  

Immediately, you will see a random address that cloudflared has generated. It maps to the address http://localhost:11434 on your computer. When accessed from another machine at https://tennis-coordination-korea-wv.trycloudflare.com, we see the same result as accessing http://localhost:11434 on the local machine.

The above is just an example of mapping an arbitrary port on your machine to the Internet; for Ollama and many other tools, you also need to configure the hostname in the request headers. The Ollama documentation instructs:

$ cloudflared tunnel --url http://localhost:11434 --http-host-header="localhost:11434"  

After that, try calling the API using the new URL. Note that you must have pulled the llama3.2 model in Ollama beforehand (e.g. with ollama pull llama3.2).

curl https://tennis-coordination-korea-wv.trycloudflare.com/api/generate -d '{  
  "model": "llama3.2",  
  "prompt": "Why is the sky blue?"  
}'  
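By default, /api/generate streams its reply as newline-delimited JSON: one object per chunk, each carrying a fragment of text in a `response` field, with the final object marked `"done": true`. A minimal sketch of reassembling such a stream in Python — the sample lines below are illustrative, not real model output:

```python
import json

def join_stream(ndjson_lines):
    """Concatenate the 'response' fragments from an Ollama
    newline-delimited JSON stream into one string."""
    parts = []
    for line in ndjson_lines:
        line = line.strip()
        if not line:
            continue
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Illustrative chunks in the shape /api/generate emits:
sample = [
    '{"model":"llama3.2","response":"The sky ","done":false}',
    '{"model":"llama3.2","response":"is blue.","done":true}',
]
print(join_stream(sample))  # The sky is blue.
```

If you would rather receive a single JSON object instead of a stream, the request body also accepts `"stream": false`.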

Wonderful! At this point everything is done: you have an API endpoint pointing to the Ollama instance on your local machine that anyone can access. However, if you have a domain on Cloudflare and want a fixed address such as api-ollama.2coffee.dev, you need to configure it through the six steps.

Keeping a Fixed Domain

It's very simple; after completing step 4 in the article Adding a "Tunnel Locally" Tool - Bringing Local Servers to the Internet, modify the contents of the config.yml file as follows:

tunnel: <tunnel-uuid>  
credentials-file: path/to/.cloudflared/<tunnel-uuid>.json  

ingress:  
  - hostname: api-ollama.2coffee.dev  
    service: http://localhost:11434  
    originRequest:  
      httpHostHeader: "localhost:11434"  
  - service: http_status:404  

Then run:

$ cloudflared tunnel run <tunnel-uuid>  
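If the DNS record for the hostname does not exist yet (it is normally created in the steps from the earlier article), cloudflared can add the CNAME for you; api-ollama.2coffee.dev here is just the example hostname from the config above:

```shell
$ cloudflared tunnel route dns <tunnel-uuid> api-ollama.2coffee.dev
```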

Although this method gives you an API address of your own, much like OpenAI's, it has many limitations, such as depending on the machine's configuration and the model being used. By default, Ollama handles only one query at a time, so continuous or simultaneous requests will not be served efficiently.
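Because the local Ollama instance answers one generation at a time by default, letting callers pile in simultaneously just queues them at the model. One sketch of capping in-flight requests on the client side is an asyncio semaphore — here `fake_request` is a stand-in for the real HTTP call, not part of any Ollama API:

```python
import asyncio

async def with_limit(sem, coro):
    """Run a coroutine only after acquiring the concurrency cap."""
    async with sem:
        return await coro

async def main():
    # Allow only one in-flight request, mirroring Ollama's
    # default one-query-at-a-time behaviour.
    sem = asyncio.Semaphore(1)

    in_flight = 0
    peak = 0

    async def fake_request(i):
        nonlocal in_flight, peak
        in_flight += 1
        peak = max(peak, in_flight)
        await asyncio.sleep(0.01)  # stand-in for the HTTP round trip
        in_flight -= 1
        return i

    results = await asyncio.gather(
        *(with_limit(sem, fake_request(i)) for i in range(5))
    )
    return results, peak

results, peak = asyncio.run(main())
print(results, peak)  # [0, 1, 2, 3, 4] 1
```

Raising the semaphore's value would allow more parallelism, but with a single local model it mostly just moves the waiting around.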
