
I tried Sanctum's local AI app, and it's exactly what I needed to keep my data private

Feb. 04, 2025 | Hi-network.com

Locally installed AI is the way to go, especially if privacy is important to you. Instead of sending your queries to a third party, you can keep them private, so no one else has access to your questions or the generated answers. When you run a query with the locally installed Sanctum, your data is encrypted, secure, and never leaves the app.

Also: How I made Perplexity AI the default search engine in my browser (and why you should too)

I've been using locally installed AI for a while now (mostly Ollama with the addition of the Msty front-end) and have found it to be quite useful.
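
For context, here is roughly what "local" means in practice with a setup like Ollama: the model serves an HTTP API on your own machine (by default at localhost:11434), so prompts and answers never cross the network. A minimal Python sketch, assuming Ollama is installed and a model such as llama3 has already been pulled:

```python
import requests

# Ollama listens on localhost by default, so the prompt and the
# response stay on this machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3",   # assumes this model was pulled beforehand
    "prompt": "Summarize the benefits of running an LLM locally.",
    "stream": False,     # return one complete JSON response instead of a stream
}

response = requests.post(OLLAMA_URL, json=payload, timeout=300)
response.raise_for_status()
print(response.json()["response"])
```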

But why Sanctum?

  • It's local
  • Data remains private
  • Thousands of GGUF models on Hugging Face (see the sketch after this list)
  • PDF summaries
  • Works even during an internet outage
  • It's open-source
  • You can choose your LLM (from Gemma, Llama, Mistral, and more)
  • You can choose if any information is shared
  • Available prompt templates
  • Real-time information on system resources in use
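
To give a sense of the GGUF ecosystem Sanctum draws on, here is a minimal sketch (not Sanctum's own code) of downloading a GGUF build of Mistral 7B from Hugging Face and running it locally with the llama-cpp-python library. The repository and file names are illustrative; pick whatever quantization suits your hardware.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama  # pip install llama-cpp-python

# Illustrative repo and file names -- any GGUF quantization you trust will do.
model_path = hf_hub_download(
    repo_id="TheBloke/Mistral-7B-Instruct-v0.2-GGUF",
    filename="mistral-7b-instruct-v0.2.Q4_K_M.gguf",
)

# Everything below runs on the local machine; no prompt leaves it.
llm = Llama(model_path=model_path, n_ctx=4096)
result = llm("Explain GGUF quantization in two sentences.", max_tokens=128)
print(result["choices"][0]["text"])
```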

Sanctum could easily become instrumental for research on any given subject, especially when you don't want your queries to go beyond your local machine. No matter what you're researching, Sanctum can help.

Also: How to install Perplexity AI's app on Linux (I found an easier way)

Let me walk you through the process of getting Sanctum up and running. It's quite easy.

How to install Sanctum

What you'll need: The only things you'll need are either a MacOS or Windows computer (Linux version coming soon) and a network connection. I'll demonstrate the installation on MacOS. If you're using a Windows computer, the installation is as simple as installing any other application.

1. Download the installer

The first thing to do is download the installer. Head to the Sanctum site, click the Download drop-down, and select your OS; the download will start.


2. Install the app

Once the download has finished, locate and double-click the file in Finder. A new window will pop up, asking you to drag the Sanctum icon to Applications. Do that, and the installation is done. You can then eject the Sanctum drive on your desktop and delete the download.


Installing Sanctum on MacOS is very easy.

Jack Wallen

How to set up Sanctum

It's now time to configure Sanctum.

1. Save your recovery phrase

As soon as you start Sanctum, click the Get Started button on the main window. In the resulting window, you'll need to copy your recovery phrase and paste it somewhere safe. This is important because if you ever lose your login credentials, you'll need the phrase to recover access.

You can either just copy the phrase or reveal it and then copy it.

Jack Wallen

2. Set a password

In the next window, you are required to create a password. Make sure this password is strong and unique. When you've done that, click Continue.

Make sure to use a strong and unique password.

Jack Wallen
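
If you'd like a quick way to generate a strong, unique password rather than inventing one yourself, here is a small sketch using Python's standard secrets module (just an illustration; a reputable password manager works as well).

```python
import secrets
import string

# Build a 20-character password from letters, digits, and punctuation,
# using the cryptographically secure `secrets` generator.
alphabet = string.ascii_letters + string.digits + string.punctuation
password = "".join(secrets.choice(alphabet) for _ in range(20))
print(password)
```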

3. Download an AI model

You can now decide which LLM to download and use. You can select others later, but you'll want to select one here. After making your selection (I opted for Mistral), click Continue.

If you decide to change your LLM later, that's possible from within the Sanctum app.

Jack Wallen

4. Select your privacy level

You can now select the level of privacy you want for Sanctum. I selected "Don't share a single byte," and I suggest you make the same choice. Once you've done that, click Continue.

I would highly recommend you don't send a single byte.

Jack Wallen

At this point, you can start using Sanctum as a locally installed AI tool to assist you with all (or some) of your research.

Artificial Intelligence

  • The best AI for coding in 2025 (and what not to use - including DeepSeek R1)
  • I tested DeepSeek's R1 and V3 coding skills - and we're not all doomed (yet)
  • How to remove Copilot from your Microsoft 365 plan
  • How to install an LLM on MacOS (and why you should)

