In this detailed post, I explore the innovative AI capabilities of Claris FileMaker 2024 as presented by esteemed FileMaker expert Beverly Voth in her recent PDF. With her extensive experience and numerous accolades, Beverly provides a comprehensive look at how the latest AI features can revolutionize FileMaker development. This analysis is crucial for developers looking to leverage AI to enhance their applications, streamline operations, and unlock creative solutions. For further details, please refer to the original PDF on Claris Community. I am grateful to Beverly Voth for granting permission to post her work in HTML format on this blog.
What is this AI I keep hearing about?
With Claris FileMaker 2024, there are new features that enhance our ability to do AI integrations. But I thought we could already do that?
Yes! With previous features for Core ML and using the script step Insert From URL, we can!
Now we want AI to be easier to use/integrate with Claris FileMaker. This article is to give you a few basics and then suggest ways to use these features. Resources will also be noted at the end of the article.
Artificial Intelligence (AI) has been around for many years. But it seems to be a more popular buzzword in the last few years, especially in the Claris FileMaker Community. We just know that “AI” is a fairly large and complex technology that is ever evolving. We are all learning what it is for us as developers and how to use it now. My dictionary says the term is a noun:
“the theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.”
So AI is something that may help us automate some of the tasks we do now for developing and designing. We will explore ways that we might do that with some use case ideas!
But this is something different that I have to learn, right?
We know that under the hood Claris tries to make things easy for us. But with the additional new functions, script steps, and other “external” (but integrated) functionality in the last few years, it can be a long haul to get proficient at developing better solutions. These can be learned, but how can we leverage AI to help us get there faster? What might we divine for using AI to teach us how, or to help us create? Some of the focus of this article is to help you get going with AI, so you might even use it to help you learn those other technologies!
- LLM = Large Language Model – the storage of and the computational ability to compare data. Can be cloud-based or local, both may require adequate storage space and processing power based on what data is available and what needs to be done with it. Access to these models may be fee-based.
- Tokens = the LLM stores the data as tokens (numbers) to make processing/comparisons faster. These tokens are not necessarily “words”, but are the basis for what you might be charged (per token) to use the LLMs. See Embeddings.
- Prompt/Prompt Engineering = a way to ask, using Natural Language (and/or inclusion of detailed sample code, documents, or images), the AI models for information. These may be brief or provide more detail to narrow the matches to what is being asked. You’ll find many prompt examples in Ronnie Rios’ Claris Engage 2024 demos and in the resources links.
- Open API – (often referred to as a public API) is a publicly available Application Programming Interface that provides developers with programmatic access to a (possibly proprietary) software application or web service. Open APIs are APIs that are published on the internet and are used for more than just AI.
- API key = Application Programming Interface Key, a registered key with various AI Providers. This may be free, but is likely to have some fee based on the number of tokens used. Your key can be entered at runtime by a script, so that the key is not stored in the database that uses it. The key will be available throughout your session for that particular provider and the different models they offer. It is set for your database file using the “Configure AI Account” script step and remains valid throughout the session, as long as the same Account Name is used for successive calls. See script steps, below.
- Embeddings = the numeric tokens that are created and used for semantic comparisons. These are not keyword or word comparisons, but the “meaning” of words or phrases, tokenized (calculated) to provide for “similar” comparisons. See also Vectors.
- ChatGPT = Chat Generative Pre-trained Transformer. Just one model among many, and it has several versions. OpenAI is the provider.
- ML = Machine Learning.
- Core ML, for example, can be trained to refine the matches of what is being compared.
- DDL = Data Definition Language or Data Description Language – a syntax for creating and modifying database objects such as tables, indices, and users. For example, the SQL usage of actually changing the Schema. A code example is provided at the end of this article.
- Vectors – I found this article very helpful: How Vectors in Machine Learning Supply AI Engines with Data. A few snippets:
“Artificial intelligence engines need data to learn and operate, but the data you and I find meaningful is foreign to machines. Machines need data translated to their preferred language: math. This conversion happens with the help of vectors.”
“What are vectors in machine learning? Vectors are mathematical representations of data that have both magnitude and direction. We use them to convert data to math that machines can understand, process, and analyze.”
- Cosine Similarity = how similar are two vectors? Don’t stress over the formulas shown in the link! Luckily for us: see the new functions in Claris FileMaker 2024, and see the video & get the materials from Wade Ju’s presentation at Claris Engage 2024 (see links below). Hint: the math is done for us! (A tiny worked example follows this list.)
- See also Claris documentation for AI terms & features.
- As new possibilities are explored and shared, you’ll discover new terms! Remember to discuss them on the Claris AI topic of the Claris Community forum.
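To make the cosine similarity idea a little less abstract, here is a tiny made-up example written as a FileMaker calculation. The two 2-D “vectors” are invented for illustration only; real embeddings have hundreds or thousands of dimensions, and the new CosineSimilarity() function does this math for you.
/* illustration only: A = (1, 2), B = (2, 3) */
Let ( [
dotProduct = (1 * 2) + (2 * 3) ; // = 8
magnitudeA = Sqrt ( 1^2 + 2^2 ) ; // ≈ 2.236
magnitudeB = Sqrt ( 2^2 + 3^2 ) // ≈ 3.606
] ;
dotProduct / ( magnitudeA * magnitudeB ) // ≈ 0.99 – the two vectors point in nearly the same direction
)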
What functions and steps are in Claris FileMaker 2024?
- Notes: Some steps listed below will have a ❋ symbol meaning there are selections to specify in a pop-over dialog. The “___” is where the calculation dialog is used to enter: text, field reference(s), or variable name(s). “{…}” is a list of options in a drop-down.
- All script steps are Server Compatible, except “Configure Machine Learning Model”. The Help topic says “Partial” with the notes: This script step is supported only on iOS, iPadOS, and macOS. Please check your own system.
- Help topics listed below are links to the English version, although Claris Help has other language versions.
Check for possible changes!
1. Script Step: Configure Machine Learning Model [ Operation: {Unload, Vision, General}; From: ModelContainerField ] ❋ Specify Field
- Loads a Core ML (Machine Learning) model and prepares it for use.
- NOTE: this step used a “Name:” parameter instead of “Operation:” in previous versions
- “From:” is not required when using the “Unload” Operation; otherwise, the ❋ appears when you click “From” and the Specify Field dialog opens.
2. Function: ComputeModel ( modelName ; name1 ; value1 )
- Returns a JSON object containing the result of the Core ML model evaluation.
- For general models: ComputeModel ( modelName ; parameterName1 ; value1 )
- For vision models: ComputeModel ( modelName ; “image” ; value1 ; “confidenceLowerLimit” ; returnAtLeastOne )
3. Function: GetModelAttributes ( modelName )
- Returns metadata in JSON format about a named model that’s currently loaded.
4. Script Step: Insert from URL [ Select; With dialog: {On, Off} ]
- Target: Specify (Field or Variable) ; Specify URL (Specify) ; Verify SSL Certificates (checkbox toggle) ; Specify cURL options (Specify)
- Enters the content from a URL into a field or variable.
- Insert from URL is used to call the APIs of the various providers and their models, with the use of cURL options. See each API to determine the parameters to provide.
- For some great examples of how this is used for AI, see the demo files that go along with Ronnie Rios’ Claris Engage 2024 Session. TIP: check all the scripts, field definitions, value lists, & custom functions on the demos of how these can work.
- Can this step still be used for communicating with AI? Yes, but it may be more complex and requires using the cURL options to set up. I would not mix using this step with the others. However, if you need more options beyond the new steps in Claris FileMaker 2024, Insert from URL can still be used.
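As a rough, hedged illustration (not from the original article), a chat-style call to OpenAI might look like the sketch below. The endpoint, the model name “gpt-4o-mini”, the JSON message structure, and the $apiKey and $prompt variables are assumptions based on OpenAI’s public API and are subject to change; always check the provider’s current documentation.
# assumes $apiKey and $prompt were set earlier in the script
Set Variable [ $body ; Value: JSONSetElement ( "{}" ;
[ "model" ; "gpt-4o-mini" ; JSONString ] ;
[ "messages[0].role" ; "user" ; JSONString ] ;
[ "messages[0].content" ; $prompt ; JSONString ] ) ]
Set Variable [ $curl ; Value: "-X POST" &
" --header " & Quote ( "Content-Type: application/json" ) &
" --header " & Quote ( "Authorization: Bearer " & $apiKey ) &
" --data @$body" ]
Insert from URL [ Select ; With dialog: Off ; Target: $result ; "https://api.openai.com/v1/chat/completions" ; cURL options: $curl ]
# the reply text is typically at JSONGetElement ( $result ; "choices[0].message.content" )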
5. Function: GetLiveText ( container ; language )
- Returns the text found in the image of the specified container field using the text-recognition algorithm.
- While it is not listed as an AI function (it is under the Container functions), many may think of it as having “AI-like” magic. And like the Configure Machine Learning Model script step, it is only compatible with macOS, iOS, & iPadOS.
- See the Claris Engineering blog: Working with the GetLiveText function in FileMaker Pro and FileMaker Go
6. Function: GetLiveTextAsJSON ( container ; language )
- Returns the text and position as JSON data for each line of text found in an image in a container field by using the text-recognition algorithm.
- JSON result instead of Text (like the function above). See the Engineering blog.
New in 2024, with descriptions and tips
7. Script Step: Configure AI Account [ Account Name: ___ ; Model Provider: {OpenAI, Custom} ; API key: ___ ]
- Sets up an AI account to use by name, given a model provider (or endpoint) and an API key.
- This is going to be the first step before using the other steps or functions. You may name the account anything unique to be used with the other steps/functions in the same database. This name ties the key in, so you don’t need to re-enter it for each step, and it expires when you close the file. If you use Custom as the Model Provider, you will be asked to enter an Endpoint value (URL) before the API Key parameter (even for locally hosted models!)
- TIPS: create a global storage field or variable with the Show Custom Dialog script step (or other means) to ask the user for the Account Name and Key. Then they are used at run time and not stored in the database. Or you may prefer the Account Name to be hard-coded; that’s an option. Just keep in mind that a fee-based key will be charged if used by others!
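A minimal sketch of that run-time setup, assuming a hypothetical global field Globals::gAPIkey and the illustrative account name “myAIsearch” (use whatever names fit your solution):
# ask for the key at run time so it is never stored in the file
Show Custom Dialog [ "AI setup" ; "Enter your OpenAI API key" ] # input field mapped to Globals::gAPIkey
Set Variable [ $apiKey ; Value: Globals::gAPIkey ]
Set Variable [ $accountName ; Value: "myAIsearch" ]
Configure AI Account [ Account Name: $accountName ; Model Provider: OpenAI ; API key: $apiKey ]
Set Field [ Globals::gAPIkey ; "" ] # clear the global so the key is not left behind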
8. Script Step: Insert Embedding [ Account Name: ___ ; Embedding Model: ___ ; Input Text: ___ ]
- Target: (Specify – the selection of: Field or Variable)
- Enters the vector representation of input text, using an embedding model, into a field or variable.
- Converts the Input Text into embedding tokens (as returned by the Model) for comparison later (with the “Perform Semantic Find” step, for example). Use the same Account Name set up in the “Configure AI Account” step; the Embedding Model is one from the Model Provider set up there as well. For OpenAI (Provider), use one of these, for example: text-embedding-3-small, text-embedding-3-large, or text-embedding-ada-002. The Target is the field or variable where this converted data is stored, and it may be stored as binary (container field) or text values. Binary may be faster for comparisons, and you can use the functions to convert from one type to the other.
- TIPS: These values are all text values, so you may use dialogs to get the options, or Insert Text into fields or variables for use later. The Account Name may be the variable you previously set, for example, but it must match what was set in the “Configure AI Account” step. And the Embedding Model may be a drop-down list that you allow the user to select, but it must be available from the previously selected Model Provider.
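For example, a single record’s text might be embedded like this sketch (the Notes table, its fields, and the model name are placeholders):
Insert Embedding [ Account Name: $accountName ; Embedding Model: "text-embedding-3-small" ; Input Text: Notes::Summary ; Target: Notes::Summary_Embedding ]
# a container Target stores the vector as binary; a text field stores it as text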
9. Script Step: Insert Embedding in Found Set [ Account Name: ___ ; Embedding Model:___ ]
- Source Field: (Specify) ; Target Field: (Specify) ; Replace target contents (checkbox toggle) ; Parameters: ___
- Gets the vector representation of text in all records of the specified field using an embedding model.
- This uses the found set of records and sets/replaces the computed embedding (as returned by the Model) for each record; these are the embeddings you will later compare your prompt against.
- TIPS: Like the “Insert Embedding” step, the storage may be binary or text, but it is stored in a field only (no variable choice). Don’t replace the field value if the record has already been processed and you know there is no need to refresh the embedding. Parameters are set up as JSON; see the code example below, which can be refined for better results. These values might be user-defined for testing better responses, so perhaps include methods to select them, with validation.
- Parameters, description, default value – see the help topic for allowed ranges:
- MaxRecPerCall – The maximum number of records put in a call to generate embeddings, 20 (records)
- MaxRetryPerWorker – The maximum retries before failure, 5 (tries)
- MaxWaitPerRetry – The maximum time to wait (according to OpenAI’s response) before returning an error, 60000 (milliseconds), OpenAI only
- TruncateTokenLimit – When TruncateEnabled is on, the text is truncated to this number of tokens, 8185 (token limit), OpenAI only
- TruncateEnabled – Truncate text at TruncateTokenLimit before sending to OpenAI, 1 (true), OpenAI only
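Putting that together, a batch-embedding script might look something like this sketch. The find criteria, table, and field names are placeholders; Parameters is optional and only needed to override the defaults above (a JSON example appears in the code examples at the end).
# embed only the records that do not have an embedding yet
Perform Find [ Restore ] # e.g. find records where Notes::Summary_Embedding is empty
Insert Embedding in Found Set [ Account Name: $accountName ; Embedding Model: "text-embedding-3-small" ; Source Field: Notes::Summary ; Target Field: Notes::Summary_Embedding ]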
10. Script Step: Perform Semantic Find [ Query by: {Natural language, Vector data} ; Account Name: ___ ; Embedding Model: ___ ; Text: ___ ; Record Set: {All records, Current found set} ]
- Record set {All records, Current found set}; Target field (Specify); Return count (Specify) ; Cosine similarity condition {greater than, less than, equal to, greater than or equal to, less than or equal to} ; Cosine similarity value (Specify)
- Performs a semantic find in the specified field and constrains the specified record set for the given search text and model to use or for the given embedding vectors.
- If you choose the “Query by: Vector data” parameter option (instead of “Natural language”), then only the “Record Set” parameter is required along with specifying the text to query, although other ❋ settings may be used to refine the search. This presumes you have created the embeddings to use on those records. See code examples below for this Script step.
11. Script Step: Set AI Call Logging [ On/Off ]
- Filename (Specify) ; Verbose (checkbox toggle)
- Turns AI call logging on or off and outputs the debug log to the specified file name in the documents folder.
- TIPS: Likely this is a preference, but if “on”, the step should be called before any other AI steps, except “Configure AI Account”. Not every call may need to be logged, and you may or may not want everything tracked, but it certainly helps at first to debug. This is not a global setting, and different scripts & steps may have different log files. You specify the file where this is saved and whether to use Verbose logging; the default “LLMDebug.log” is used if none is specified. Remember that the default storage is your Documents folder/directory.
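For instance, during development you might wrap the AI work like this (the log file name is just an example; omit Filename to get the default LLMDebug.log in your Documents folder):
Set AI Call Logging [ On ; Filename: "MySolution_AI.log" ; Verbose: On ]
# ... the AI steps you want traced (Insert Embedding, Perform Semantic Find, etc.) ...
Set AI Call Logging [ Off ]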
12. Function: CosineSimilarity ( v1 ; v2 )
- Returns the similarity between two embedding vectors generated by the Insert Embedding script steps, as a number between -1 (opposite) and 1 (similar). Vectors are not compatible between models.
- Store results in a Field or Variable.
13. Function: GetEmbedding ( account ; model ; text )
- Returns the embedding vector as container data using an embedding model for the specified input.
- Store results in a container field, so use Set Field [], Set Variable [], or similar steps. Use the same account and model as set by the Configure AI Account.
- Function: GetEmbeddingAsFile ( text {; fileNameWithExtension } )
- Converts embedding data in text fields, variables, and calculations to an embedding data file. You can specify a filename and extension for the file created from the embedding data text.
- Store results as binary in a container field. Binary searches can be more performant than Text searches. A default filename will be used if you don’t specify.
- Function: GetEmbeddingAsText ( data )
- Converts an embedding data file in container fields, variables, and calculations to embedding data text.
- Store in Field or Variable to convert binary to text.
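A small sketch tying these functions together; the account name, model, and the Notes fields are placeholders. The query text is embedded on the fly, compared against a stored embedding, and then the helper functions convert between the text and binary storage types.
Set Variable [ $queryVector ; Value: GetEmbedding ( "myAIsearch" ; "text-embedding-3-small" ; $searchText ) ]
Set Field [ Notes::Similarity ; CosineSimilarity ( $queryVector ; Notes::Summary_Embedding ) ] # a number between -1 and 1
Set Field [ Notes::Embedding_AsText ; GetEmbeddingAsText ( Notes::Summary_Embedding ) ] # binary -> text
Set Field [ Notes::Summary_Embedding ; GetEmbeddingAsFile ( Notes::Embedding_AsText ; "embedding.dat" ) ] # text -> binary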
14. Function: GetTableDDL ( tableOccurrenceNames ; ignoreError )
- Returns table information in Data Definition Language (DDL) format for a list of table occurrences specified as a JSON array.
- The result will be SQL CREATE TABLE text (one statement for every TO in the JSON array).
- See Code Example, below
- Function: GetTokenCount ( text )
- Returns the token count for the specified text. Use for guidance only; actual counts used by models may vary.
- Remember that tokens are used for charges if there is any fee. Also the count may help you refine the prompt or other settings.
15. Function: Get ( LastStepTokensUsed )
- Returns the tokens used in the last AI script step.
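A quick way to keep an eye on usage, with illustrative field and variable names (remember the counts are guidance only):
Set Variable [ $estimate ; Value: GetTokenCount ( Notes::Summary ) ] # rough count before the call
Insert Embedding [ Account Name: $accountName ; Embedding Model: "text-embedding-3-small" ; Input Text: Notes::Summary ; Target: Notes::Summary_Embedding ]
Set Field [ Log::TokensUsed ; Get ( LastStepTokensUsed ) ] # what the last AI step actually consumed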
TIPS for AI steps and functions:
- Set up variables for some of the parameter values that may be used again and again, such as Account Name, Model, etc., for use in the new script steps and functions
- Note: these steps and functions may not be updated on the Help reference pages:
New menu locations for the selection of AI script steps or functions
Script steps: Artificial Intelligence | Functions: Artificial Intelligence
Configure AI Account [ ] | ComputeModel ()
Configure Machine Learning Model [ ] | CosineSimilarity ()
Insert Embedding [ ] | GetEmbedding ()
Insert Embedding in Found Set [ ] | GetEmbeddingAsFile ()
Perform Semantic Find [ ] | GetEmbeddingAsText ()
Set AI Call Logging [ ] | GetModelAttributes ()
 | GetTableDDL ()
 | GetTokenCount ()
 | Get ( LastStepTokensUsed )
See the updated Claris Help for additional information on all these. In previous versions some items may have appeared under the Miscellaneous or other menus.
Complex calculations using current functions
- Help construct a While() statement
- Help create statistical formulas beyond the built-in functions
- Convert Excel formulas to Claris FileMaker calculations
- Create Custom Functions (what is the first Monday of next month, for example)
- Return Holidays (what is the holiday date for this/next year for ___?)
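As one illustration (my wording, not from the article), the first idea could be turned into a prompt assembled in a variable; the field Invoices::LineItems is hypothetical:
Set Variable [ $prompt ; Value:
"Write a Claris FileMaker While() calculation that turns the return-delimited list in the field Invoices::LineItems" &
" into a JSON array, and explain each parameter of While()." ]
# send $prompt with Insert from URL (see the cURL sketch earlier) or paste it into your chat tool of choice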
Learning other integration technologies
- Prompts that provide a sample ExecuteSQL() function call, the table(s)/field(s), and the natural-language description of what is needed, so the AI can construct the query (see the sketch after this list of ideas)
- Prompts to provide the table(s)/fields, and example JavaScript to help construct “views” for the Web Viewer (Data Tables, for example), including the HTML, CSS, & JavaScript needed
- Parsing XML – returning the XPATH to a particular element and/or attribute for a given Schema
- Parsing JSON – return the path needed to a particular element (object or array index) for a given Schema
- Writing XML – using whatever schemas are provided, create an XSLT to map the elements from one to the other (XML output). That could be used for Claris FileMaker Import or Export (Records), for example
- Writing JSON – provide the prompt with the functions needed to write JSON and with schema examples, and let it write the calculation for you
- Conversion of “formats”, grammar, or schema: XML <-> JSON <-> CSV
- Help write the cURL text needed for API requests
- Help write the command line text needed for the Claris FileMaker tools that use a CLI (command line interface) in a Terminal app, for example
- Help write the OData endpoint for related table queries (OData API requests)
- Analyzing Log files (from Server or Locally)
- You’ll find many Use Case ideas by other developers. What do you think you’d like to do?!
- Many of these ideas/suggestions are demonstrated in the set of materials provided with the Videos on AI from the Claris Engage 2024 conference: “CalcWriter”, “NL search”, “Smart Dashboard”, & “LLM Apps”. Other resources may contain demos using the ideas or have even more.
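To make the ExecuteSQL() prompt idea from the list above concrete, here is a rough sketch. The "Invoices" table occurrence and the request wording are placeholders, and the generated SQL still deserves a human review before use.
Set Variable [ $ddl ; Value: GetTableDDL ( JSONMakeArray ( "Invoices" ; Char(13) ; JSONString ) ; 1 ) ]
Set Variable [ $prompt ; Value:
"Given this FileMaker schema:" & ¶ & $ddl & ¶ &
"Write an ExecuteSQL() calculation that returns the total amount per customer for the current year." ]
# send $prompt to your configured model (for example, with the Insert from URL sketch shown earlier)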
- Claris AI, Claris Community topic – This space already has content with examples, questions & answers, and other resources. Bookmark it; better yet, log in and “follow” the topic to get email notifications (if you have them set) of any new questions. Watch this space grow as we all learn, test, ask, and share what we have learned.
- Watch for other blog posts and presentations; there is a lot to cover! Share links you find interesting.
- Claris Engage 2024, Videos on AI (playlist) – don’t forget to download all session materials at the start of the videos!
- Delivering AI driven solutions with Claris FileMaker: a practical overview (Ronnie Rios) – includes these demo files: “CalcWriter”, “NL search”, “Smart Dashboard”, & “LLM Apps”
- Let’s chat! chatGPT in your daily routine (Joris Aarts)
- AI in business: Responsible real-world applications (Cris Ippolite)
- AI Foundations for Claris FileMaker Developers: A technical deep-dive (Ernest Koe)
- AI under the hood: Integrating Large Language Models (Wade Ju)
- Exploring The Potential: AI Integration in Claris FileMaker Solutions Webinar – Video: with Ronnie Rios, Cris Ippolite, Ernest Koe, & Joris Aarts
- Using an On-prem LLM for AI Semantic Search in FileMaker 21, blog by Wim Decorte, Soliant – lots of details for more advanced information than can be provided in this article.
- Hugging Face: https://huggingface.co – plan on a day (or two) to explore this awesome website for collections of AI topics.
- ClarisTalk AI – podcasts & videos, with Matt Navarre, Cris Ippolite, & guests (the video link includes all AI videos from iSolutions)
- The Claris AI Learning Initiative
- “The goal would be to have a community-sourced knowledgebase of modern FileMaker training content text that could be offered for FREE as API endpoints to developers worldwide to integrate FREE FileMaker training to new learners and platform evaluators everywhere. All while leveraging secure, responsible and modern AI to accomplish this.”
- Kick-off Video – read the show notes, including the link to join the Discord space for continued discussion.
- How to Add Natural Language Search to your Claris FileMaker Applications – Acolyte Applications – by Jonathan Nicoletti, Natural Language search as an Add-on
- Video and Demo File for new AI script steps in FM 2024, blog by Mike Wallace – “…going over the comparison of semantic search via Pinecone API and the new FileMaker script steps.”
- Claris Marketplace – you will begin to see free and paid tools to help you with AI and other integration technologies.
- W3Schools AI tutorial:
- https://www.w3schools.com/ai/default.asp
- Check out the Graphic on the home page!
- The menu shows a list of what is covered in this tutorial, including: Machine Learning in JavaScript, Machine Learning Examples, Learning is Looping, ML Terminology, & Machine Learning Data. Not all will apply to Claris FileMaker 2024 features, but use the site for exploring more.
- Prompts/Prompt Engineering
- Community Live Episode 2: Learn about AI prompting basics by Cris Ippolite – Includes the video from this session (30 May 2024). There are also accompanying materials containing additional links about Prompt Engineering.
- chatGPT “Program Quickstart Guide” by Cris Ippolite – “Guides users through a prompt engineering course, focusing on the course script and exercises.”
- Effective Prompts for AI: The Essentials blog – Improve the questions (prompts) to ask to get better answers (results). You may search for other articles, or ask ChatGPT!
- How to Write Good AI Prompts: A Beginner’s Guide (+12 Templates) blog – good examples under different categories!
- “Keep It Simple and Direct: Your AI doesn’t appreciate riddles and treasure hunts. So, the straighter the instruction, the better. Be clear, be concise, and be direct.”
- “Provide Adequate Context: Context is like handing your AI a detailed map to the treasure chest. A prompt sans context is like asking Da Vinci to paint a portrait blindfolded.”
- “Single Tasks Only, Please: Overloaded prompts can overstimulate and confuse your AI—it’s not built to multitask.”
- Just read the article!
- xkcd – AI in the comics
- xkcd main site: https://xkcd.com/
- xkcd explained: https://www.explainxkcd.com/wiki/index.php/Main_Page
- xkcdfinder: https://xkcdfinder.com/
- Search for ‘AI’ or use:
- Artificial Intelligence category list of comics
- xkcd’s own AI – How does the search work? “Data about each comic, such as the title, transcript, explanation, etc. are gathered from xkcd.com and explainxkcd.com. This data is converted in OpenAI’s embeddings, and then added to a vector database. When you make a search, your text is also converted to an embedding, which is then compared to each embedding in the database to find similar intent.”
- See the source code on GitHub: https://github.com/KDJDEV/xkcdfinder
- AI Providers (not a full list) – check each site for pricing & what models they provide
- OpenAI (https://openai.com/) – most common, including ChatGPT
- New! Introducing GPT-4o and more tools to ChatGPT free users
- Anthropic (https://www.anthropic.com)
- Cohere (https://cohere.com/)
- LM Studio (https://lmstudio.ai/) – for local LLM information
- You’ll find others getting in on providing LLMs and AI integration.
- The Missing FM 12 ExecuteSQL Reference – by Beverly Voth, get the PDF and example files to help you understand the SQL. But let the AI create the SQL for you!
- ChatGPT for FileMaker: Practical Applications – Soliant – By Karl Jreijiri, “… As we continue to explore and push the boundaries of what AI can do for FileMaker and other applications, it’s crucial to stay abreast of the latest advancements and use cases. The examples and use cases shared in this blog post are just the tip of … “
- The UX Designer’s Role in a World of AI – podcast by Alexis Allen and Matt O’Dell
- [WSJ] Apple Is Developing AI Chips for Data Centers, Seeking Edge in Arms Race (subscription required to read – search for many articles on Apple and AI!)
- Apple silently releases its Deep Learning framework as open-source code – article by Dennis Sellers, December 6, 2023, “Apple is silently releasing its Deep Learning framework as an open-source code… The new MLX framework runs natively on Apple Silicon with a single pip install.”
- FileMaker ChatGPT Integration – blog post by Cath Kirkland, 19 March 2024, “Simple integration using ChatGPT and FileMaker Pro solving two business problems.” There is an example file to download and this article uses the still-valid Insert From URL script step. The details for the cURL are a good comparison with using the newer script steps in Claris FileMaker 2024.
- Working with LLMs in FileMaker 2024, Claris Engineering blog – including these topics: LLM terms, FileMaker LLM tools, LLM options, Configure LLM, Create and store vector embeddings, Perform semantic searches, Example: Meeting search with OpenAI, Appendix: Collect cosine similarity, Appendix: Collect LLM token usage, Appendix: Debug LLM
1. GetTableDDL ( tableOccurrenceNames ; ignoreError ) – ie: [ "TO1_mystuff", "TO2_yourstuff", … ]
- TIP: use JSONMakeArray ( listOfValues ; separator ; type ), also new in Claris FileMaker 2024. You can point it to TableNames ( "" ) as the “listOfValues” if you want all Table Occurrences in the current file, or use the List() function to provide specific TOs to return. The separator would be Char(13) – the carriage return, used with many “list” functions. But you may have a comma- or tab-separated list; just use the appropriate Char(). The “type” would be quoted text, so use the reserved word JSONString (or its numeric equivalent, 1).
JSONMakeArray ( TableNames ( "" ) ; Char(13) ; JSONString )
- This returns a list of Table Occurrences in the current file.
- If you want to specify only one TO, you may use one quoted value for the list (no need to add the carriage return): “TableOCC”, but use the Char(13) as the delimiter:
JSONMakeArray ( "TableOCC" ; Char(13) ; JSONString )
- Combine this function with GetTableDDL() to return a SQL dump-like result:
GetTableDDL ( JSONMakeArray ( "TableOCC" ; Char(13) ; JSONString ) ; 1 )
- Sample results. You might research how SQL creates tables with this syntax:
CREATE TABLE "TableOCC" (
"ID_PK" varchar(255), /* any comment */
"created_TS" datetime, /* record creation timestamp */
"class_ID" varchar(255),
"title" varchar(255),
PRIMARY KEY (ID_PK)
);
2. Parameters for the Insert Embedding in Found Set step, JSON
- Remember that you only need to provide parameters when you want to override any of the defaults (see above).
{"MaxRecPerCall": 20, "MaxRetryPerWorker": 5, "MaxWaitPerRetry": 60000, "TruncateTokenLimit": 8185, "TruncateEnabled": 1}
JSONSetElement ( "" ;
[ "MaxRecPerCall" ; 20 ; JSONNumber ] ;
[ "MaxRetryPerWorker" ; 5 ; JSONNumber ] ;
[ "MaxWaitPerRetry" ; 60000 ; JSONNumber ] ;
[ "TruncateTokenLimit" ; 8185 ; JSONNumber ] ;
[ "TruncateEnabled" ; 1 ; JSONNumber ]
)
3. Perform Semantic Find examples – Note: these are as printed from the Script Workspace:
# using Query by: Natural language
Perform Semantic Find [ Account Name: "myAIsearch" ; Embedding Model: "text-embedding-3-large" ; Text: "abc" ; All records ; NewTable::myTarget ; Return count: 5 ; Condition: greater than ; Value: 5 ]
# using Query by: Vector data
Perform Semantic Find [ Vector data: "abc" ; All records ; NewTable::myTarget ]
# note: no Account or Model required, and other parameters are optional
Beverly Voth¹ started using FileMaker Pro professionally with version 2.1, but has a floppy disc for use on a Lisa/Mac from before Claris purchased the product. She has been a full-service hosting provider, full-stack web developer, and SQL db administrator, and is now semi-retired, serving as volunteer Lead Facilitator for WITfm. You’ll find her articles and blog posts in the Claris FileMaker Communities.
- FileMaker Booth, JAN 2000, San Francisco MacWorld: ThemeCreator™, XML-based theme editor, co-creator
- FileMaker DevCon 2001, Orlando DevCon, Speaker: TEC312, “5 Ways to Publish FileMaker Data to the Web”
- FileMaker DevCon 2002, Palm Desert DevCon, Speaker: PRE202, “Using XML with FileMaker Pro”, TEC307, “CDML vs. XML”, & TEC309, “Creating XSL Stylesheets with FileMaker Pro”
- FileMaker® Pro 6 Developer’s Guide to XML/XSL, book 2003, Author
- FileMaker Excellence Award 2003, Phoenix DevCon: FileMaker Excellence Award for Outstanding Contribution to the FileMaker Web Publishing Community
- The Missing FM 12 ExecuteSQL Reference, blog 2012, Author
- FileMaker 12 In Depth, book by Jesse Feiler, 2012, Tech Editor
- Claris Excellence Award 2020, Virtual Engage: 2020 Community Leadership: Leader of the Year
Acknowledgements: Kudos to the Claris Team and the other AI “cooks” that helped me on this journey!
¹ Voth (long o: rhymes with both, not moth)