🧠 Run Obsidian with FastFlowLM: A Faster, Smarter Second Brain

This guide walks you through setting up Obsidian (with the AI Providers and Local GPT plugins) and FastFlowLM (FLM) so they work seamlessly together.


📑 Prerequisites

Before starting, ensure you have the following installed:

  • Obsidian (desktop app)
  • FastFlowLM (FLM)


🚀 Quick Start

Install plugins in Obsidian

  1. Open your Obsidian vault.
  2. If the sidebar is collapsed, click Expand in the top-left corner.
  3. In the bottom-right corner of the sidebar, click ⚙️ to open the settings panel.
  4. Under the Options section (left), click Community Plugins.
  5. Click Browse, then search for AI Providers.
  6. Install and enable AI Providers.
  7. Repeat the same steps to install and enable Local GPT.

Start FLM server

Run the following command in PowerShell to launch the FastFlowLM server:

flm serve llama3.2:1b
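
To confirm the server is reachable before wiring up Obsidian, you can list the available models from a second PowerShell window. This is a quick sanity check, assuming FLM listens on port 52625 (the port used in the provider URLs below) and exposes the standard OpenAI-style model-listing route:

# List models served by FLM (assumes the default port 52625)
Invoke-RestMethod http://localhost:52625/v1/models

If the server is running, the response is JSON that should include llama3.2:1b.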

Configure AI Providers

Option A: OpenAI API

  1. Click ⚙️ to open the settings panel.
  2. Under the Community Plugins section (left), click AI Providers.
  3. Click ➕ to create a new provider.
  4. Set Provider type to OpenAI.
  5. Enter the Provider URL: http://localhost:52625/v1.
  6. Enter a Provider name (e.g., FLM_OpenAI).
  7. Enter any API key (e.g., flm).
  8. On the Model line, click the 🔄 icon to load available models, then select your preferred model (e.g., llama3.2:1b).
  9. Click Save.
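
If you want to verify the OpenAI-compatible endpoint outside Obsidian, you can send a small test request from PowerShell. This is a sketch, assuming FLM exposes the standard /v1/chat/completions route that OpenAI-type providers rely on; the model name and prompt are just placeholders:

# Build a minimal OpenAI-style chat request (hypothetical test prompt)
$body = @{
    model    = "llama3.2:1b"
    messages = @(@{ role = "user"; content = "Say hello in one sentence." })
} | ConvertTo-Json -Depth 5

# Send it to the FLM server's OpenAI-compatible endpoint
Invoke-RestMethod -Uri "http://localhost:52625/v1/chat/completions" -Method Post -ContentType "application/json" -Body $body

If everything is wired up correctly, the response should contain a choices array with the model's reply.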

Option B: Ollama API

  1. Click ⚙️ to open the settings panel.
  2. Under the Community Plugins section (left), click AI Providers.
  3. Click ➕ to create a new provider.
  4. Set Provider type to Ollama.
  5. Enter the Provider URL: http://localhost:52625.
  6. Enter a Provider name (e.g., FLM_Ollama).
  7. Enter any API key (e.g., flm).
  8. On the Model line, click the 🔄 icon to load available models, then select your preferred one (e.g., llama3.2:1b).
  9. Click Save to apply your AI Provider.
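
As with Option A, you can optionally check the Ollama-compatible side of the server from PowerShell. This sketch assumes FLM mirrors Ollama's /api/tags model-listing route, which is what the Ollama provider type queries when it loads models:

# List models via the Ollama-style API (assumes FLM implements /api/tags)
Invoke-RestMethod http://localhost:52625/api/tags

The response should list the models the server can serve, including llama3.2:1b.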

Configure Local GPT

Pick models

  1. Click ⚙️ to open the settings panel.
  2. Under the Community Plugins section (left), click Local GPT.
  3. Set Main AI Provider to the provider you configured earlier (e.g., FLM_OpenAI ~ llama3.2:1b).
  4. Review the Action list to see what Local GPT can do.

Set up hotkeys

  1. Open the settings panel (click ⚙️ if it is not already open).
  2. Under the Options section (left), click Community Plugins.
  3. Under Installed Plugins (right), find the Local GPT row, then click ⊕ to add hotkeys.
  4. For each action, click ⊕ on the right side and assign a hotkey.

For example:

  • Alt + H → General Help
  • Alt + C → Continue Writing
  • Alt + S → Summarize
  • Alt + G → Fix Spelling and Grammar
  • Alt + F → Find Action Items

💡 Start Creating Smarter

🎉 Ready to roll… let’s give it a try!

✨ General Help

📌 Scenario: You wrote some short notes in Obsidian, but they look too simple.
👉 Highlight your notes, press the hotkey you set for General Help (e.g., Alt + H), and the LLM will expand them into a full explanation.

✍️ Continue Writing

📌 Scenario: You started a paragraph but don’t know how to continue.
👉 Highlight the unfinished text, press your Continue Writing hotkey (e.g., Alt + C), and the LLM will keep writing naturally for you.

πŸ” Summarize

📌 Scenario: You have a long article or meeting notes that take too much time to read.
👉 Highlight the long text, press your Summarize hotkey (e.g., Alt + S), and the LLM will give you the key points in a short summary.

📘 Fix Spelling and Grammar

📌 Scenario: You typed something fast and it’s full of mistakes.
👉 Highlight your text, press your Fix Spelling and Grammar hotkey (e.g., Alt + G), and the AI will clean it up and make it easy to read.