OData for FileMaker Tutorial from A to Z

This tutorial is your comprehensive, practical guide to OData in FileMaker Server 21.1.x and beyond, designed for engineers who want to understand, implement, and troubleshoot real-world OData integrations with FileMaker. OData (Open Data Protocol) is now a first-class feature of FileMaker Server, enabling standard, RESTful access to your data for reporting…
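For a sense of what that access looks like in practice, here is a minimal sketch of querying a FileMaker Server OData endpoint from Python. The host, database, table, and credentials below are placeholders; the `/fmi/odata/v4/` base path follows Claris's published OData API, but check your server's documentation for the exact URL and authentication options available to you.

```python
import requests
from requests.auth import HTTPBasicAuth

# Placeholder values -- substitute your own server, file, table, and account.
HOST = "https://fms.example.com"
DATABASE = "Contacts"   # FileMaker file name (without the extension)
TABLE = "People"        # table exposed through OData

# FileMaker Server exposes OData under /fmi/odata/v4/<database>/<table>.
url = f"{HOST}/fmi/odata/v4/{DATABASE}/{TABLE}"

# Standard OData query options: filter, project, and limit the result set.
params = {
    "$filter": "City eq 'Madrid'",
    "$select": "FirstName,LastName,City",
    "$top": 10,
}

response = requests.get(
    url,
    params=params,
    auth=HTTPBasicAuth("odata_user", "secret"),  # a FileMaker account with OData privileges
    timeout=30,
)
response.raise_for_status()

# OData wraps result records in a "value" array.
for record in response.json().get("value", []):
    print(record)
```

Because OData is just HTTP plus a query convention, any language with an HTTP client can consume the same endpoint in the same way.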

Read More

Refactoring the Chaos: A Human-AI Partnership in the Valley of FileMaker

In the Valley of FileMaker, we found complexity born not of malice — but of time. Through layered scripts, value lists, and fragile logic, we unraveled the tangle not with judgment, but with clarity. Together — a human and an AI — we refactored, simplified, and respected what came before. This is our story of rebuilding trust in the native tools of a misunderstood platform.

Read More

How a Tiny Button Saved My Sanity: Clearing FileMaker Pro Cache to Fix Connection Issues

If you’ve ever been caught in a whirlwind of frustration while troubleshooting FileMaker Pro connectivity issues, you’re not alone. Recently, I found myself on the verge of madness (let’s call it “CraZZZZZZZZzzzzzzzzzY”) while trying to resolve a mysterious problem where FileMaker Pro 20 and 21 clients could no longer connect to a FileMaker Server on port 5003. Meanwhile, FileMaker Pro 19 connected flawlessly. The culprit? Cached settings. The hero? A single Delete Cached Temp Files button.

Here’s the full story, along with step-by-step instructions to help you avoid the rabbit hole I fell into.

Read More

Step-by-Step Guide to Implementing AI-Powered Semantic Search in FileMaker 2024

Community Live 13: Jumpstart AI in Claris FileMaker – a step-by-step workshop. FileMaker 2024 introduces a variety of AI tools, specifically for semantic search, which organizes and retrieves data based on contextual meaning. These features lay the groundwork for integrating sophisticated search…
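As a rough illustration of the idea behind semantic search (a conceptual sketch, not FileMaker's own script steps), the Python snippet below ranks records by cosine similarity between embedding vectors. The embeddings and query vector are made-up stand-ins for whatever model your AI account is configured to use.

```python
import math

def cosine_similarity(a, b):
    """Similarity of two embedding vectors, in the range [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical pre-computed embeddings for a few records
# (in FileMaker these would be stored alongside the record data).
records = {
    "Invoice overdue reminder": [0.12, 0.83, 0.41],
    "Customer birthday greeting": [0.75, 0.10, 0.33],
    "Past-due payment notice":   [0.15, 0.80, 0.45],
}

# Embedding of the user's query, produced by the same model.
query_embedding = [0.14, 0.81, 0.43]

# Rank records by contextual closeness rather than keyword overlap.
ranked = sorted(
    records.items(),
    key=lambda item: cosine_similarity(query_embedding, item[1]),
    reverse=True,
)

for title, _ in ranked:
    print(title)
```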

Read More

Setting up a Portable Local AI Environment using Llama 3.2 Vision, Docker on Windows Subsystem for Linux, and FileMaker for Image Recognition

This guide provides a step-by-step approach to setting up a portable AI environment using Docker on Windows Subsystem for Linux (WSL). We’re focusing on creating a flexible setup that allows you to run large language models, such as those available with Ollama, in an offline and secure environment. This setup is particularly useful for organizations or individuals who need to work without direct internet access or who want the flexibility to move their setup between different machines.
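To give a flavor of the workflow, here is a minimal sketch of sending an image to a locally running Ollama instance from Python. The model name, image path, and port assume Ollama's defaults and the Llama 3.2 Vision model discussed in the post; adjust them to match your own container setup.

```python
import base64
import json
import urllib.request

# Assumes Ollama is running locally (the Docker/WSL setup from this guide)
# and the llama3.2-vision model has already been pulled.
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3.2-vision"
IMAGE_PATH = "sample_photo.jpg"  # placeholder image to describe

# Ollama accepts images as base64-encoded strings in the request body.
with open(IMAGE_PATH, "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {
    "model": MODEL,
    "prompt": "Describe the contents of this image in one sentence.",
    "images": [image_b64],
    "stream": False,  # return a single JSON response instead of a stream
}

request = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request, timeout=120) as response:
    result = json.loads(response.read())

# The generated description could then be written back to a FileMaker
# field via the Data API, OData, or Insert from URL.
print(result.get("response", ""))
```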

Read More