Setting Up Ollama
Introduction
To use embedded AI in Transana, you must first set up a program called Ollama. Ollama provides AI services from a computer you control and, when properly configured, does not retain or share your sensitive, confidential data. Fortunately, the process of setting up Ollama is quite easy.
Download and install Ollama
The first step is to browse to the Ollama Download Page. Click on your operating system, download the file (Windows and macOS) or enter the download command (Linux), and install the program in the same way you install other software on your computer. This installs both the Ollama Server behind the scenes and a simple program for chatting with Ollama AI models outside of Transana.
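If you would like to confirm the installation worked, you can check it from a terminal (Command Prompt or PowerShell on Windows, Terminal on macOS and Linux). This optional sketch uses Ollama's standard command-line tool:

```shell
# Print the installed Ollama version; an error here suggests the
# installation did not complete.
ollama --version

# List the models currently installed (empty after a fresh install).
ollama list
```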
Configure Ollama
Once Ollama is running, you should configure it. On Windows, the Ollama logo appears on the right side of the Taskbar; on macOS, it appears on the right side of the menu bar. Right-clicking this icon gives you a “Settings” option. You can also reach Settings through the Ollama interactive interface, via a sidebar option on Windows or the menu on macOS.
Configuration Guidelines
- You do not need to create an Ollama account or sign in.
- We strongly recommend disabling Cloud mode. This setting tells Ollama to keep all data local and disables Cloud models, which require sending data to external servers.
- Turn the Expose Ollama to the network option off unless you explicitly intend to access Ollama from a different computer or share your Ollama server with other members of your research team.
- You can change the directory for storing Ollama models if you wish. Models can take up significant storage, especially if you download many of them. The settings shown above instruct Ollama to store models (on a Windows computer) on an external hard drive, drive N:, in the “Ollama” directory.
- You can ignore the Context Length setting, as this is managed through Transana.
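If you administer the Ollama Server directly, the same choices can also be made through environment variables documented by Ollama. A sketch (the storage path is a hypothetical example; substitute your own drive or folder):

```shell
# Bind the Ollama Server to this machine only (the safe default);
# setting 0.0.0.0 instead would expose it to the network.
export OLLAMA_HOST=127.0.0.1

# Store downloaded models in a custom directory (hypothetical path).
export OLLAMA_MODELS="/mnt/external/Ollama"
```

Restart the Ollama Server after changing these variables so they take effect.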
Connect from Transana
You configure Transana’s connection to your Ollama server in the “Explore Data with Embedded AI” tool on the Settings tab. See the Explore Data with Embedded AI page of the Transana Tutorial for more information on how to get to this tab.
- Set the Ollama Host setting to http://localhost:11434. This is the default setting.
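If you want to verify that an Ollama server is actually listening at that address, you can query its REST API directly from a terminal. A quick check using the /api/tags endpoint, which lists installed models:

```shell
# A JSON reply means the Ollama server is running at the default
# address; "Connection refused" means it has not been started.
curl -s --connect-timeout 3 http://localhost:11434/api/tags \
  || echo "Ollama server not reachable"
```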
Downloading and Installing Models
When you first install Ollama and connect to it through Transana, you will not have any models installed. There are many models to choose from within Ollama, and figuring out which model(s) to download and use may seem daunting for beginners. We recommend you read Using AI as a way of familiarizing yourself with this topic. (We tested about 140 models and determined which work well for text analyses and image analyses. The results are included there.) You may also want to visit the Ollama Models page. A sample model listing from that page for the “ministral-3” model is shown here.
To add a new model to your Ollama server from within Transana, press the Add a New Ollama Model button on the Settings tab. Enter the model name carefully, being sure to include both the model name and the “parameters” description you want. For example, if you’ve chosen to try the “ministral-3” model, you must choose between the 3b, 8b, and 14b options. Enter the model “ministral-3:8b” (for example) when asked for the model name. Ollama will not be able to download and install the model without the full specification.
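If you are comfortable in a terminal, the same download can be performed with Ollama's pull command, using the full model:parameters name exactly as you would type it in Transana (the model shown is just the example from this page):

```shell
# Download the example model. The ":8b" tag selects the parameter
# size, matching the full specification Transana asks for.
ollama pull ministral-3:8b
```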
To protect data confidentiality, it is important to avoid using “cloud” or “turbo” Ollama models with Transana, as these models submit your data to external servers rather than processing it locally. In the above example, “ministral-3:8b” is processed on your computer, but “ministral-3:8b-cloud” would send data to an external server.
Advanced Ollama
You don’t have to run Ollama on the same computer where you use Transana. You can install Ollama on the fastest computer on your network to improve performance. You can also install Ollama on a shared computer for everyone on your team to share.
To connect to Ollama running on a different computer, you change the Ollama Host setting in Transana. Instead of using host http://localhost:11434, which uses the same computer that is running Transana, you substitute the IP address for another computer that is running Ollama. In my case, I use http://192.168.1.204:11434 to connect to my Windows laptop. Your IP address will, of course, be different.
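You can verify such a connection by querying the remote server's REST API from a terminal (the address below is the example from this page; yours will differ). The remote machine must have the Expose Ollama to the network option turned on for this to succeed:

```shell
# Query the remote Ollama server's model list to verify the connection.
curl -s --connect-timeout 3 http://192.168.1.204:11434/api/tags \
  || echo "Remote Ollama server not reachable"
```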
To maintain data security, you should only connect to Ollama on computers that you control and that are running on your local network. The Ollama configuration guidelines provided above apply to networked computers as well.

