
Welcome to the ORIGINAL FileMaker Community

Take a moment to join us, no noise, all FileMaker...We Promise

Blogs

 

Happy Holidays & End Of Year Sale

We at Geist Interactive, Barbara, Dave, Lance, Jeremy and Todd (and the ghosts) wish everyone a happy holiday. No matter your celebration plans this season, we wish it to be full of warmth and love and joy. Thank you for coming along with us as we continue to push what is possible in FileMaker and […] The post Happy Holidays & End Of Year Sale appeared first on Geist Interactive.
View the full article

todd geist


 

Multi-Column Portals

A colleague recently posed this challenge: is it possible to show two columns in a portal, such that the first row displays records 1 and 2, the second row displays records 3 and 4, etc.? With FileMaker, the answer is usually “Yes!” The Challenge Here are some clarifications of the challenge: The records must be […]
View the full article

Beezwax Buzz


 

What to Look for in a CRM

A CRM is more than a way to track sales. It allows you to see what types of leads are converting to customers, uncover what activities are driving sales, automate tasks, provide better customer service, and streamline your business processes to boost profitability. A CRM is the hub of your business, the tool that [...] The post What to Look for in a CRM appeared first on The Scarpetta Group, Inc..
View the full article

Joe Scarpetta


 

The FileMaker Filtering JSON function

Our JSONAdditions.fmp12 file provides additional useful custom functions. One of them is a FileMaker filtering JSON function: JSON.FilterByExpression The post The FileMaker Filtering JSON function appeared first on Geist Interactive.
View the full article

todd geist


 

Faux Subsummaries with Variable Height Rows

Recently we’ve looked at two methods to generate a “faux” subsummary to address a shortcoming of FileMaker native subsummaries… namely that in a multipage report you can have orphaned entries at the top of a given page with no indication of what parent entity they belong to. The methods were documented here: Conditional Summary Report […]
View the full article

Kevin Frank


 

Elapsed Time Display in a Web Viewer

A simple approach that is easy to integrate into other FileMaker solutions. A Web Viewer is fed an HTML page containing a JavaScript timer that displays the elapsed time. The same FileMaker Start Timer script also records a Start Time field, and the timer stops when the user clicks a FileMaker End Timer button/script.
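As a minimal sketch of the general idea (not the author's actual file), the Web Viewer's address could be set to a calculation that builds a small timer page as a data URL; everything here, including the HTML layout, is illustrative:

"data:text/html," &
"<html><body><div id='elapsed'>0:00</div>" &
"<script>
var start = Date.now();
setInterval( function () {
  var s = Math.floor( ( Date.now() - start ) / 1000 );
  document.getElementById('elapsed').textContent =
    Math.floor( s / 60 ) + ':' + ( '0' + ( s % 60 ) ).slice( -2 );
}, 1000 );
</script></body></html>"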
View the full article

Douglas Alder


 

Introducing Admin FM – the FileMaker Admin API for Zapier

Introducing Admin FM – the Zapier integration for accessing the FileMaker® Admin API in an easy, customizable way. Built using Node.js, this version of Admin FM is currently in beta, but it exposes most of the functionality that the FileMaker Admin API offers. What is Zapier? Let’s take a moment […]
View the full article

Beezwax Buzz


 

Mail Merge with FileMaker

493.4 million pieces of mail are delivered every day by the US Postal Service. If you currently hand-type your business’s envelopes, letters, and other complex documents like proposals and legal forms, you would greatly benefit from using mail merge. Mail merging, the process of filling in a template document with records from a data source, saves countless hours of double data entry and reduces human error; instead of typing out each recipient’s information, Microsoft Word (or another similar word processor) does all the work for you. In addition, you can create mail merges with records from your FileMaker system to save even more time.

Basics of Mail Merge

To create a mail merge, you need two things: a template and a data source. In this example, we will use Microsoft Word to create the template and FileMaker as the data source. If you do not have Microsoft Word, there are plenty of free word processors that can perform mail merges.

Exporting FileMaker Records

To prepare your FileMaker records for mail merging, you need to export them as a .mer file. To export a merge file:

1. In your FileMaker system, get into the found set of records you would like to merge with your template.
2. Click “File” > “Export Records,” change the file type to “Merge,” and choose where you want to save your file.
3. In the “Specify Field Order for Export” dialog, choose the fields you want to be able to merge in your template file and click Export.
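The same export can also be scripted. Purely as a rough illustration (this script, its path, and its field order are hypothetical and not from the original article), a FileMaker script that exports the current found set as a merge file might look like this:

# Export Found Set as Merge File (hypothetical example)
Set Variable [ $path ; Value: Get ( DocumentsPath ) & "Contacts.mer" ]
# Output file type "Merge" and the export field order are chosen in the script step's Specify dialogs
Export Records [ With dialog: Off ; "$path" ]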
In the “Specify Field Order for Export” dialog, chose the fields you want to be able to merge in your template file and click export. Your mail merge data source can be other file types, but a .mer file is certainly the easiest to create from FileMaker. Creating Your Mail Merge Template Because mail merging has been around for a while, many people will already have their templates ready. In case you don’t, however, the process is very straight forward: Create a document with your preferred word processor. Choose the .mer file that you exported earlier as your data source. To do this in Microsoft Word 2016, click “Mailings” > “Select Recipients” > “Use an Existing List” and find your .mer file. Place fields in the template where necessary. In Word, click “Mailings” > “Insert Merge Field” and select the field you wish to place in the document. If you wish to view what your filled out template looks like in Word, click “Mailings” > “Preview Results”. Now your template is all set for merging! You can reuse this template for later mail merges. The image below shows what your Word document will look like before and after the mail merge respectively. Things To Note There are more ways to make the process even easier! We have created a FileMaker file that allows users to export contact records and a mail merge template with a few simple clicks. If you change the name of a field in FileMaker that is also used in your mail merge, you will need to update the corresponding variable in your template file. If your solution requires more automation, you can use FileMaker plugins like 360Works Scribe or MonkeyBread Software to automatically generate documents without the need for exporting records. You can also use AppleScript instead of plugins to automatically generate documents out of FileMaker, but you will still need to export the .mer file and have AppleScript automatically run the mail merge in Word or a similar word processor. Conclusion Using mail merge with FileMaker is a highly efficient technique at generating documents as simple as letters to as complex as proposals or legal forms. Feel free to contact us if you would like assistance generating documents from your FileMaker system! Download Mail Merge with FileMaker Database Please complete the form below to download your FREE FileMaker database file. Name* First Last Company Phone* Email* Deployment Assistance?Please contact us to assist integrating into your FileMaker Database. Yes Terms of Use I agree OPT-IN: I agree that I am downloading a completely free FileMaker application file with no strings attached. This database is unlocked, and I may use it for my business or organization as I see fit. Because I am downloading a free database, I agree that I should receive occasional marketing. I understand that I can OPT-OUT of these emails at anytime.
View the full article

dbservices


 

How to Avoid Common Pain Points in Custom Software Development

Many companies rely on out-of-the-box software to run all, or at least many parts, of their business. As companies evolve, however, this software often cannot meet their needs to streamline workflows and improve productivity. Their complex business rules simply outpace their off-the-shelf software, and they need something more robust to handle their workflows. Custom software solutions are the obvious next step, but companies may pause at this option because of pain points that arise when creating or using custom software. These pain points can come from different sides of a project, the project team or the business team, and both share the burden of challenges related to implementing custom software.

Pain Point 1: Unclear Communication

As a developer, my point of view primarily stems from the standpoint of the project team. Typically, we developers are not great communicators. This can be a significant struggle when working on large development teams that must communicate effectively during development. The struggle becomes magnified if your team is spread across many different time zones or includes offshore developers.

Solution: Work with Experienced Project Managers

Having a project manager familiar with the primary technology platform your project implements keeps lines of communication open and developers on task. Even better, ensure your project manager has a good understanding of the technical conversations that occur among developers on the project team. A project manager with solid technical understanding can provide key input to resolve disagreements between developers quickly and efficiently when they differ on how a project should unfold. A knowledgeable project manager becomes even more of a necessity if the project is large and complex. These projects can create many headaches for everyone involved. As requirements are gathered and handed to the development team for implementation, the hope is a direct translation of requirements to logic. However, as project requirements expand and the project gains complexity, a single list of project requirements can become too intricate for developers to consume. Instead, the teams should break requirements down into several smaller, more manageable stories. Breaking requirement lists into consumable stories allows developers to work on features simultaneously and release features rapidly.

Pain Point 2: Perceived Cost

Many businesses hear “custom software solution” and immediately worry about cost. Custom software can range from tens of thousands to hundreds of thousands of dollars and can take anywhere from six months to multiple years to be fully implemented.

Solution: Tighten Up Estimation Early

A good ballpark estimate starts with a requirements-gathering phase and provides a range of best- to worst-case scenarios and, therefore, the effort expected. All stakeholders, or key decision makers, should meet and determine necessary features during this stage. If the ballpark estimate exceeds the desired project budget, stakeholders can review the defined requirements and identify which features are critical to a minimum viable product and which can be implemented at a later date. Making these decisions about requirements before the project moves to the development phase ensures the project stays on time and on budget.

Pain Point 3: Perceived Project Length

Businesses also worry about how long they’ll have to wait for a finished implementation. Custom projects, no matter the industry, can stretch out over months. When companies decide they need a custom solution, they often need it right then and there and don’t want to wait months to implement it. Unfortunately, shooting for the low end of the project estimate while also trying to meet deadline commitments is easier said than done. As the development phase progresses, requirements often expand. As new features make their way into the requirements, this “scope creep” inevitably pushes back the estimated delivery timeline. Many of these issues stem from common perceptions about software development. For example, what may seem like a simple addition to project stakeholders can be a significant coding effort for developers.

Solution: Involve All Stakeholders

To eliminate scope creep and ensure critical features are included during the initial project requirements phase, involve all stakeholders from end to end of development. They have the best knowledge of how the process works and what they’re looking for. Don’t forget about end users! Key decision makers may call the shots for project details, but the end users will vocalize their current struggles with the solution itself. They can help validate the requirements defined by key decision makers. These team members can also expose any disconnect between back-end and front-end consumption of the product. With their involvement, you can better plan for features that would typically require scope additions during development. This allows you to build a more complete scope of the deliverable project from the beginning. Their insight increases the accuracy of your project and timeline estimates and benefits everyone involved.

Investing in Custom Software

A custom software solution helps address many business needs that an out-of-the-box software solution can’t. The investment requires a great deal of communication among everyone involved and the discipline to stay true to set requirements. Having a project manager streamline conversations between the project team and the stakeholders dissolves the communication issues inherent in complex systems. Identifying well-defined requirements gathered from all stakeholders prevents scope creep, project delays, and budget overages. Don’t let these common pain points hold you back. Once you put these processes in place, you can tackle increasing efficiency in your business with custom software. Have questions? Our team may be able to answer them and provide direction. You can start the conversation here. The post How to Avoid Common Pain Points in Custom Software Development appeared first on Soliant Consulting.
View the full article

Soliant Consulting


 

Advent of Code 2018: Innovation for fun

What's more fun than solving problems in FileMaker? Well, hop on board Advent of Code 2018 as we innovate for fun. Read here to learn more about it. The post Advent of Code 2018: Innovation for fun appeared first on Geist Interactive.
View the full article

todd geist


 

4 Ways You Can Streamline Business Processes with FileMaker

Your company’s profitability is tied to your efficiency. The more productive your business is, the more you’ll be able to do with the same amount of resources. Streamlining business processes is something that all successful companies are focused on right now. Thanks to advances in technology, companies can automate and reduce costs, simultaneously freeing [...] The post 4 Ways You Can Streamline Business Processes with FileMaker appeared first on The Scarpetta Group, Inc..
View the full article

Joe Scarpetta


 

bBox for FileMaker v0.93 Now Available

We are pleased to release bBox version 0.93. bBox is a free utility plug-in that extends FileMaker solutions with easy access to code libraries and macOS-based functions from Python, JavaScript, PHP, Ruby, AppleScript, Bash/sh, XPath, and SQLite. Included is a demo file with over 210 examples of how you can put bBox functions to work for you. Bug Fixes: Fixed for 0.93 is […]
View the full article

Beezwax Buzz


 

Salesforce Einstein: Machine Learning for the Masses

More than the Robot Revolution

Even as we approach 2019, AI still feels like a foreign, futuristic ideal to many. We think of robots and reference movies like The Terminator, Blade Runner, or Ex Machina. But AI is here. You can use it right now. It’s accessible, and it’s powerful. Businesses, specifically, have unique opportunities to combine their existing technologies with AI capabilities to make smarter decisions. They can leverage the information at their fingertips through the power of data science and smart algorithms ready to do the work for them. While many existing technology platforms have dipped their toes into this space, few have pursued it as aggressively as Salesforce to put it in the hands of the masses.

What is Salesforce Einstein?

In 2016, Salesforce unveiled Einstein, a suite of tools that lets the platform’s users put their data to work and automate workflows. The tools capitalize on the information already within Salesforce, saving you time while making you more productive and efficient on the platform. For example, Einstein used in Sales Cloud empowers sales reps to focus on their highest-quality leads through automated lead prioritization. The tool analyzes each lead on a regular basis and reranks them accordingly, ensuring each sales professional is always making the best use of his or her day.

No Data Science Experience Necessary

Most AI requires data scientists to build models that tell machines how to look at data. Salesforce is cutting out the middleman and the tedious process of building a model; it leverages the data you already have in the platform. Salesforce uses machine learning to make this work. It learns from your behavior, creates data rankings itself, and then sorts and prioritizes your data accordingly, adapting to your needs. As a result, the more data you put into the system, the better Einstein’s models and insights get.

Available Across the Platform

Expanding far beyond just the Sales Cloud, Einstein features are available in Service Cloud, App Cloud, Community Cloud, Marketing Cloud, and Analytics Cloud too. Many features are baked right into existing functionality and require no additional investment in the platform; you can just log in and get data insights.

Leveraging Einstein

Instead of feeling overwhelmed by the data you have available to mine, it’s time to feel empowered by it. Einstein can dig through all of it for you (at a much faster pace) and provide feedback on how to move forward. AI makes data useful and can shed light on areas you haven’t yet had time to explore: customer journeys, personas, and more.

In Sales Cloud, Einstein tracks sales activities and prospect responses to prioritize contacts and opportunities. You can automate your follow-up activities to reach more leads more often. Einstein even delivers news about top accounts, empowering you to know more about critical shifts in their business.

In Service Cloud, Einstein provides insights on the best way to manage customer service issues, from case management to scheduling responses. The AI also connects the dots on common problems, getting to the root of a bigger issue and enabling your team to build a solution faster.

In Community Cloud, Einstein monitors your users to determine the ideal experience and then helps you scale that experience for all other users. In most cases, personalization plays a huge part in this, and with a huge database of users this is impossible without AI. Using machine learning, you can deliver customized experiences to every member of your community.

In Marketing Cloud, Einstein makes personalized messaging possible across all channels for each target audience. You can deliver a strategic and highly customized customer experience through customer journeys made possible with AI. Personalized marketing is difficult to scale, but machine learning makes it easy.

Einstein Lightning Integrations

For those of you who have made the switch from Salesforce Classic to Lightning, I have good news: Einstein works even better with Lightning features. The new UI unlocks a variety of features, further emphasizing Salesforce’s hints at a future required migration to Lightning.

Customizing Einstein

Need to leverage additional data outside of Salesforce within Einstein’s capabilities? The platform plugs into other APIs to connect not just your data but also your strategy and insights, making you smarter and streamlining your paths to success. You can even use Einstein’s capabilities outside of Salesforce by plugging it into external apps and leveraging the functionality within other systems; Einstein’s REST APIs do the heavy lifting for you.

The Caveats of Einstein

There’s one thing I’ve got to stress: any AI, Einstein included, is only as good as its data. In other words, good data management is everything. “Garbage in, garbage out,” as they say. It’s also important to remember that you can only manage what you can measure. Einstein can’t evaluate information that you haven’t quantified. You can often assign a ranking or number system to your data, and I recommend you do so wherever you’ll need insights before you launch Einstein.

Launch Einstein Today. Get Insights Tomorrow.

Our team of business analysts and Salesforce architects can take a look at your implementation to uncover new ways to leverage Einstein. Contact us to speak with an experienced team member today. The post Salesforce Einstein: Machine Learning for the Masses appeared first on Soliant Consulting.
View the full article
 

Like a Boss: The FileMaker Join table

As we solve our clients' problems, we might encounter complex data structure needs. A FileMaker join table is a technique to solve some of those issues. Let's take a look at a FileMaker Join Table. The post Like a Boss: The FileMaker Join table appeared first on Geist Interactive.
View the full article

todd geist


 

Project Estimation Best Practices: A Look at Our Three-Point Estimator

One of the biggest risks in undertaking a big software development project is underestimating the time and effort involved from start to finish. Our team has provided estimates for hundreds of projects, and sometimes we’ve missed the mark, especially in the early days of our business. Over the years, we’ve put together a strong model for estimating how much time and effort a project is likely to require, and our employees undergo annual training on the method. Using these methods ensures we present our clients with the most useful estimates possible and effectively set expectations for each project.

Not Just Simple Math

When you think of hours estimation, you probably imagine a small team of people putting together a list of tasks and then assigning hours to each. From there, it’s just simple addition, right? From the outside, I’m sure that’s what the estimation process seems like. Behind the scenes, however, things work much differently.

Building Uncertainty Ranges

No two projects are exactly the same. Even if you feel like you’re developing the same solution, there are too many variables at play to borrow from another experience to build your estimate. Comparing one project to another can be good for getting an order-of-magnitude or “ballpark” estimate. However, you wouldn’t want to make firm commitments based on just a rough comparison.

Adding Non-Development Factors

Not all of the effort in a project is spent by developers. Our project management team works hard to ensure communication remains open and transparent and keeps projects running smoothly, and they need time to accomplish their tasks. Especially going into a new client relationship, no one knows how much time and effort that will take. Some of our clients appreciate a daily update email and have few questions; others like detailed explanations of how things work behind the scenes. Projects also vary widely in how much support they need. For small projects with a single developer, a project manager may stay a little behind the scenes, making sure the developer has what she needs and sending periodic status updates. In larger projects with multiple developers, multiple client contacts, and a multi-step testing process, the project manager is heavily involved every day; he or she leads scrums and meetings, facilitates communication, and monitors overall project risks. Besides project management, an estimate should include time for internal and external meetings, for quality assurance activities, and for work necessary for deployment, to name some of the major non-coding categories. An estimate that fails to budget time for activities like testing, internal and external communication, and project management may come up only slightly short for a smaller project, but it will leave a major gap for a large effort.

Foundation Phase Estimates

If a project requires up-front design and discovery work (and most do), you also need to account for business analysis time. During our Foundation phase activities (also known in the industry as “discovery and design”), our business analysts work with clients to uncover their biggest challenges, opportunities, and goals and then build a plan to address the highest-value features and capabilities during solution development. As we wrap up design and discovery, our developers build a more accurate estimate of hours based on their lessons learned and the emerging solution blueprint.
(In fact, we often provide mid-Foundation estimates in addition to post-Foundation estimates to help put our clients at ease and promote the transparency required for successful projects. Each one gets more and more accurate and therefore encourages better client communication.)

Consider Task Size

When estimating items, smaller is definitely better. Most of us can’t reliably estimate pieces of effort that are bigger than a day or two in size. If your estimate has an item that says “manage invoicing, 20-50 hours,” you’re going to be well served by breaking that into smaller units of functionality, such as “Create Invoice,” “Print Invoice,” “View List of Invoices,” and so forth. If the project has several types of invoice, each with its own print format, then you can probably break “Print Invoice” into smaller items, one for each invoice type. If it’s too soon to break that item down, maybe it’s too soon to estimate it; or, if 20-50 hours “feels” right, just be aware that you’re estimating partly on intuition. Make sure your estimate has a range that allows for the fact that there’s more definition yet to do.

Number of Estimated Items

The number of estimated items matters too. The fewer tasks you’re combining into an estimate, the less meaningful your estimate is. Nobody estimates items right on the nose; we’re either over or under on any given item. With a large number of items, we hope that the overs and the unders balance out. With too few items, your estimate risks being derailed by a poor estimate for a single item. We abide by the rule of ten: estimates containing fewer than ten items are very dubious. The farther past ten items you go, the more stable your math gets; for best results, try to have no fewer than twenty.

Size of Estimate Ranges

Just as important is for all of your estimate items to fit within a certain range of sizes. Huge disparities between your biggest and smallest estimate items suggest either that the small items have been broken down prematurely or that the big items are not yet broken down enough, adding uncertainty to your estimate and the project as a whole. As a rule of thumb, we try to have a range of no greater than 6x between the smallest and largest items in an estimate.

Incomplete Task Lists

Estimation is all about finding something to count; your estimate is only valid to the extent that your list of work items is complete. If you’re missing 20% of your tasks, it doesn’t matter how good your estimates for the rest are; you can’t possibly know how many hours you will need for what’s not there. This is why our team prefers to make multiple estimates and only feels confident making commitments based on an end-of-Foundation-phase estimate. Once we’ve gone through a thorough Foundation for a given body of work, we feel confident that there are no major items completely missing from our list.

Using the Right Tools

To help our team estimate projects more effectively, we built a tool to put these ideas into practice. It uses some simple statistical math to create a master estimate out of a collection of work-item estimates. These estimates could be generated by a single developer, or, in a project with multiple people doing different kinds of work, they could come from separate estimators; the tool can merge the estimates of multiple estimators. Just remember that, as with all estimation, the best estimates are created by the people who will actually do the work. We believe successful businesses are built on efficient solutions, so how could we not build one for ourselves? We call it our Three-Point Estimator.

What is the Three-Point Estimator?

To provide our clients with accurate estimates, we built a custom solution for our projects. It starts with a set of line items representing individual work items; usually these are user stories, though occasionally they represent necessary technical tasks. Estimators give each item three different estimates. One number represents the “expected” or “most likely” case: if everything goes reasonably well, it should take about this long. The other two numbers are a best case (if things go perfectly) and a worst case (I can’t imagine it taking longer than X). The algorithm then measures the span of this range; wider ranges indicate more uncertainty in the estimate, while smaller ranges indicate more confidence. Finally, the algorithm combines the line-item estimates into an overall estimate. Just like the individual line items, the overall estimate is a range. Generally, we don’t consider overly optimistic scenarios; we focus on project outcomes that, according to our tool, have a 60-90% chance of occurring. Sharing optimistic scenarios that only have, say, a 1 in 3 chance of coming true doesn’t serve the project well. Our final project estimate therefore considers not only the sum of the starting task estimate ranges but also the uncertainty surrounding each item. This paints a clearer picture of what the project could require.
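The post does not publish the exact math behind the estimator. Purely as an illustration of the three-point idea, here is the classic PERT-style weighting written as a FileMaker calculation, with made-up numbers for a single work item (the actual tool may combine items differently):

Let ( [
  best = 8 ;        // best case, in hours (hypothetical)
  expected = 12 ;   // "most likely" case
  worst = 20 ;      // worst case
  mean = ( best + 4 * expected + worst ) / 6 ;   // weighted toward the expected case
  spread = ( worst - best ) / 6                  // a rough measure of uncertainty
] ;
  "About " & Round ( mean ; 1 ) & " hours, give or take " & Round ( spread ; 1 )
)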
Of course, we are always working toward more concrete numbers. The more work we do on a project, the closer we get to accurate numbers. This is why we often have a project-start estimate but provide a much tighter estimate following our Foundation phase: we’ve learned enough about the work through extensive conversations with the clients and end users to minimize uncertainty in our estimated ranges for the hours of work required.

Building Your Own Estimates

During your project estimation, I recommend keeping these principles in mind:

- Your biggest risk is not in mis-estimating an item on your list; it’s in having no estimates at all for items that aren’t on the list but should be.
- Good estimates start with lists of at least 20 distinct items.
- Estimated items shouldn’t vary too widely in size.
- Budget for non-development activities like meetings, interviews, project management, user acceptance testing, and training.

Then have a group meeting for a frank discussion of the uncertainty in the project. Consider big gaps in estimates and what you need to know to reduce them. Then either track down that information or discuss and plan for best- and worst-case scenarios. Good luck with your next project estimate! Let us know if you have any questions in a comment below. The post Project Estimation Best Practices: A Look at Our Three-Point Estimator appeared first on Soliant Consulting.
View the full article
 

How to Get More From Your Salesforce Data with the Einstein Prediction Builder

Einstein is the new generation of AI available directly within your Salesforce CRM data. It learns from your existing data and can give you predictions, insights, next steps, and recommendations. Why waste time calling on opportunities that are unlikely to convert? The Einstein Prediction Builder can help you anticipate which opportunities are most likely to close and even which products prospects are most likely to purchase. Einstein delivers scores, such as a lead score indicating how likely a lead is to convert. You can leverage estimates, such as the estimated amount a donor will give. You can also benefit from classifications, such as the category of a new lead. These focused efforts are for all users: sales, service, commerce, HR, non-profit. Anyone using Salesforce in any industry can benefit from AI fueling their decisions. From sales to service to marketing, Einstein will give you end-to-end insights to increase your customer engagement. Where could Einstein save you time during the day?

Einstein Bots and Agent

Einstein also includes Bots and Agent. Bots can be trained on your Salesforce data; they can automatically resolve routine customer issues or gather the information needed for a qualified agent handoff. Einstein Agent can predict case fields and use AI to route cases to the right agents for fast resolution.

AI Built on Clicks, Not Code

But what if you don’t have the time or developers to implement an AI solution? No problem! You can create predictions with clicks, not code. The simple Einstein Prediction Builder includes guided screens to select the object, the fields, and the type of records you want the AI to learn from to build predictions. That’s it, it’s that simple!

Customizing Your AI

If you want to add complexity or customize the predictions, you can use Einstein’s API in Apex or Heroku. You can get more advanced with targeted segmented groups (accounts in a specific territory, for example).

Crucial Component: Data Accuracy

Remember, Einstein learns from your data, so the more accurate it is, the more accurate the predictions you will get.

Einstein Voice

Announced at Dreamforce ’18, Einstein Voice uses natural language to understand the updates you need to make in Salesforce. Just finished a sales meeting with a client and need to rush to the next one? That doesn’t leave much time to log into Salesforce and type in notes. But you could dictate your notes to Einstein Voice, which will then find the correct record, log the event with all the relevant details, and even schedule reminders for you to follow up. Just as you’ve been able to use relative terms like “next month” for date fields in a report or list view, you can say “next month” to Einstein Voice and it will understand those kinds of terms.

Launching Einstein Prediction Builder

Einstein Predictions and other AI features can boost your productivity and allow your team to focus their efforts. Einstein is built into the Salesforce platform, so you just have to purchase the license and then build out the predictions that will help your business. Currently (as of Q4 2018), pricing for Predictions is $75/user/month, which gives you access to view insights created with Einstein Discovery or Einstein Prediction Builder, get recommended actions, and have real-time scoring. Need help launching your first round of predictions? Our team is well-versed in Salesforce development and implementations. We can help your team build predictions, both basic and complex, and take full advantage of all Einstein has to offer. Contact us for a free consultation today.

Einstein Prediction Builder Resources

- Trailhead
- Pricing
- Einstein Prediction Builder Beta
- The Big Book of Customer Predictions (PDF)

The post How to Get More From Your Salesforce Data with the Einstein Prediction Builder appeared first on Soliant Consulting.
View the full article
 

Passing Script Parameters: A Complete Summary from Simple to Complex

What Are Script Parameters?

Script parameters are one of many tools that all FileMaker developers should learn to utilize when developing custom FileMaker applications. The numerous benefits include writing fewer scripts, increased code reusability, and improved code organization. By designing your scripts to accept parameters, you can use a single script with logic branching (i.e., IF, ELSE IF, ELSE) to perform multiple actions within your FileMaker solution. For example, you can create a Sales Order record in your FileMaker solution from a few different contexts, such as from a Customer record, from a previous Sales Order, or from a cancelled Order. Often there are similar requirements when creating a Sales Order from each of these contexts: maybe you need to know the corresponding customer for the order, or maybe you need the shipping address. By passing script parameters, you can use a single script to create a Sales Order from various contexts without having to write and maintain separate scripts.

There are many ways to format and pass script parameters in FileMaker, and in this blog post we are going to start from the beginning and work our way up to the various ways we can pass multiple script parameters. This blog post borrows heavily from Makah Encarnacao's FileMaker DevCon 2018 presentation, so be sure to check out her video, slides, and example file for more details.

Single Script Parameter

FileMaker provides one primary method to pass script parameters into a script: the "Specify Script" dialog available throughout FileMaker. Adding a script parameter is done by simply typing into the "Optional script parameter" field at the bottom of the Specify Script dialog. Anything you type into this field will default to quoted text, unless FileMaker detects that you have entered a specific field or function. You can also click the "Edit..." button to the right of the "Optional script parameter" field to open a calculation dialog and provide a more complex script parameter. In most of our multiple-parameter examples, we will use this calculation dialog to build the parameters we pass.

Figure 1 - Optional Script Parameter

Once we pass a single parameter through the "Optional script parameter" section, we can retrieve that value in our script using the Get(ScriptParameter) function. We will use this function throughout this blog post.

Multiple Parameters with Return Delimited Lists

Sometimes a single script parameter is all you need to send to a script, but as we develop more complex solutions, we often need to pass multiple parameters to get the desired result. The simplest way to send multiple script parameters is to use the pilcrow/return character (¶) or the List() function to build the script parameter. For example, in our Specify Calculation dialog we may enter the following to pass information about a specific manufacturer in our solution:

MNF__Manufacturer::Name & ¶ & MNF__Manufacturer::Headquarters & ¶ & MNF__Manufacturer::NetWorth & ¶ & MNF__Manufacturer::DateFounded

This allows us to pass four lines of data from a Manufacturer record to our script. Inside our script we can separate each of these four lines into its own local variable by using GetValue() in its own "Set Variable" script step, as shown in Figure 2.

Figure 2 - Pass parameters with a return delimited list
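For reference, the Set Variable steps that Figure 2 depicts look roughly like this (the variable names mirror the fields passed above; this is a sketch, not the article's exact screenshot):

Set Variable [ $name ; Value: GetValue ( Get ( ScriptParameter ) ; 1 ) ]
Set Variable [ $headquarters ; Value: GetValue ( Get ( ScriptParameter ) ; 2 ) ]
Set Variable [ $netWorth ; Value: GetValue ( Get ( ScriptParameter ) ; 3 ) ]
Set Variable [ $dateFounded ; Value: GetValue ( Get ( ScriptParameter ) ; 4 ) ]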
This method of passing multiple script parameters does have potential drawbacks. For example, if any of the fields you pass as a parameter contains a return character, as when passing an entire paragraph, it can throw off your "Set Variable" script steps, because line returns in a field are preserved as part of the parameter. There are ways around this drawback, such as capturing those return characters and converting them to another, less common character. It's also worth noting that the order of the script parameters must match the variables you set in the ensuing script steps; a mismatch can lead to the wrong data landing in the wrong variable, in turn leading to difficult-to-troubleshoot bugs. What other options do we have that may help prevent these shortcomings?

Multiple Parameters with Pipe Delimited Lists

Instead of the return character, we can use another, less common character, like the pipe character (|), to delimit our script parameters. In this example, we replace the ¶ character with the | character:

MNF__Manufacturer::Name & "|" & MNF__Manufacturer::Headquarters & "|" & MNF__Manufacturer::NetWorth & "|" & MNF__Manufacturer::DateFounded

With this method, we streamline the passing of parameters, but we need a few more functions on the variable-declaration side to properly parse out the values contained in the script parameter. It is fairly straightforward to get the first and last values using the Left() and Right() functions, which let us use the position of a specific pipe character to determine where character parsing should start and end. Here is an example of returning the first value from the script parameter:

Left( $parameters; Position ( $parameters ; "|" ; 1 ; 1 ) -1 ) /*Subtract one because we don't want the pipe itself to be part of the name*/

As we can see, this technique is more advanced and requires an understanding of the Left(), Right(), Middle(), and Position() functions to retrieve multiple parameters. However, every developer should learn to utilize these powerful FileMaker functions in their custom applications.
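The article does not show how to pull the values between the first and last. As an illustrative sketch, a Middle()/Position() calculation for the second pipe-delimited value (Headquarters, in the example above) might look like this:

Let ( [
  param = Get ( ScriptParameter ) ;
  startPos = Position ( param ; "|" ; 1 ; 1 ) + 1 ;   // first character after the first pipe
  endPos = Position ( param ; "|" ; 1 ; 2 )           // position of the second pipe
] ;
  Middle ( param ; startPos ; endPos - startPos )     // the second value, e.g. Headquarters
)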
We have other methods to pass multiple script parameters, many of which are more elegant than using the pipe delimiter.

Multiple Parameters with Let Function

The Let() function is a native FileMaker function that is often a mystery box for new developers. Using the Let() function, you can not only make your complex calculations more readable and modifiable, but also declare local and global variables inside it. Therefore, we can use the Let() function to pass not just the parameter values but also the variable names! This is a super powerful function indeed. See the example below of passing multiple script parameters using the Let() function:

"Let( [ $name = MNF__Manufacturer::Name; $headquarters = MNF__Manufacturer::Headquarters ; $netWorth = MNF__Manufacturer::NetWorth; $dateFounded = MNF__Manufacturer::DateFounded ]; $name )"

In the above code, we are passing our Let() function inside quotes to preserve the code until we want to evaluate the parameters inside it. The first part of the Let() function is contained within square brackets "[]" and is where we declare the local variables for each value. Each line inside this declaration is separated by a semicolon; think of the semicolon as the end of a sentence in the declaration. Because the Let() function is passed inside quotes, we can wait to declare our variables until we are inside the called script. We have FileMaker analyze this Let() function by using the Evaluate() function, which simply takes the text passed to it and evaluates it. We can pass simple mathematical equations or other functions, as in the example in Figure 3:

Figure 3 - Pass parameters with the Let() function

With one script step, we are able to create multiple local variables for our script! See how the Let() function is super powerful? Use it! There is a drawback to this method of script parameter passing: the parameter you pass is pretty verbose, and the potential for typos is higher. Typing the wrong variable name in your Let() function can have unexpected consequences, so make sure to test your script with various parameters.
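In the called script, the single step that Figure 3 depicts amounts to evaluating the incoming parameter; a sketch (the target variable name here is purely illustrative):

Set Variable [ $ignore ; Value: Evaluate ( Get ( ScriptParameter ) ) ]
# $name, $headquarters, $netWorth, and $dateFounded are now available to the rest of the script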
Multiple Parameters with Custom Function

An alternative to the Let() function is to use a pair of custom functions to declare and then set your script parameters as local variables. This lets you predefine your script parameters as variables, but with a simpler syntax and fewer script steps, by using the "#Assign()" custom function and its related custom function "#". Visit FileMakerStandards.org to learn about these custom functions in detail. Here is an example of passing script parameters using the "#" custom function:

# ( "name" ; MNF__Manufacturer::Name) & # ( "headquarters" ; MNF__Manufacturer::Headquarters ) & # ( "netWorth" ; MNF__Manufacturer::NetWorth ) & # ( "dateFounded" ; MNF__Manufacturer::DateFounded )

This custom function uses name-value pairing: the value on the left side becomes the variable name in the ensuing script, and the value on the right becomes that variable's value. Once we pass our parameters in this format, we simply call the "#Assign()" custom function with the Get(ScriptParameter) function, as shown in Figure 4:

Figure 4 - Pass parameters with a name-value pairs function

The result is multiple variables based on the name-value pairs we defined in our script parameter. This method does require importing custom functions into your solution, and it requires careful entry of the name values to ensure variables are used correctly in your script. Overall, however, it provides an elegant and speedy way of passing script parameters and setting your script's local variables.

Multiple Parameters with JSON

FileMaker introduced native JSON functions in FileMaker Pro 16, but custom JSON functions have been around for some time. JSON stands for JavaScript Object Notation. You can store data in a similar name-value pair format, as described in the custom function option above. With JSON, we can use native FileMaker functions to pass multiple script parameters using the JSONSetElement() function. The added power with JSON is the ability to nest related data inside a JSON object. Think of this as the ability to send not only data from the current record you are viewing in FileMaker, but also related data, such as order line items or customer contact records. This allows larger sets of data to be transported in the well-known JSON data format. See this example of multiple data sets in a single script parameter declaration:

JSONSetElement ( "" ; ["name" ; MNF__Manufacturer::Name; JSONString]; ["headquarters" ; MNF__Manufacturer::Headquarters; JSONString]; ["netWorth" ; MNF__Manufacturer::NetWorth; JSONNumber]; ["dateFounded" ; MNF__Manufacturer::DateFounded; JSONString]; ["relatedModels" ; JSONSetElement ( "" ; ["model[0].name" ; GetNthRecord ( MNF_MOD__Model::Name ; 1 ) ; JSONString]; ["model[1].name" ; GetNthRecord ( MNF_MOD__Model::Name ; 2 ) ; JSONString]; ["model[2].name" ; GetNthRecord ( MNF_MOD__Model::Name ; 3 ) ; JSONString]; ["model[3].name" ; GetNthRecord ( MNF_MOD__Model::Name ; 4 ) ; JSONString]; ["model[4].name" ; GetNthRecord ( MNF_MOD__Model::Name ; 5 ) ; JSONString]; ["model[0].body" ; GetNthRecord ( MNF_MOD__Model::BodyStyle ; 1 ) ; JSONString]; ["model[1].body" ; GetNthRecord ( MNF_MOD__Model::BodyStyle ; 2 ) ; JSONString]; ["model[2].body" ; GetNthRecord ( MNF_MOD__Model::BodyStyle ; 3 ) ; JSONString]; ["model[3].body" ; GetNthRecord ( MNF_MOD__Model::BodyStyle ; 4 ) ; JSONString]; ["model[4].body" ; GetNthRecord ( MNF_MOD__Model::BodyStyle ; 5 ) ; JSONString]; ["model[0].year" ; GetNthRecord ( MNF_MOD__Model::Year ; 1 ) ; JSONString]; ["model[1].year" ; GetNthRecord ( MNF_MOD__Model::Year ; 2 ) ; JSONString]; ["model[2].year" ; GetNthRecord ( MNF_MOD__Model::Year ; 3 ) ; JSONString]; ["model[3].year" ; GetNthRecord ( MNF_MOD__Model::Year ; 4 ) ; JSONString]; ["model[4].year" ; GetNthRecord ( MNF_MOD__Model::Year ; 5 ) ; JSONString] ); "" ] )

To set each script parameter as a variable, we can use a custom function that works like #Assign() but for the JSON format. Figure 5 shows an example using the JSONCreateVarsFromKeys() custom function:

Figure 5 - Pass parameters using JSON

If we don't want to use a custom function, we can use the JSONGetElement() function to grab each name-value pair individually. The custom function route takes very few lines of code, but the native function provides individual script steps that may aid in debugging.
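As a sketch of the JSONGetElement() route (the keys come from the example above; whether the nested path resolves depends on relatedModels having been added as a nested object rather than an escaped string):

Set Variable [ $name ; Value: JSONGetElement ( Get ( ScriptParameter ) ; "name" ) ]
Set Variable [ $netWorth ; Value: JSONGetElement ( Get ( ScriptParameter ) ; "netWorth" ) ]
Set Variable [ $firstModelName ; Value: JSONGetElement ( Get ( ScriptParameter ) ; "relatedModels.model[0].name" ) ]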
Another aspect to account for when using the native JSON functions is that not all data types match up perfectly between JSON and FileMaker. For example, the date format that FileMaker uses is not directly supported in JSON, so we have to pass date values as strings. Here is a useful chart of how each FileMaker data type corresponds to JSON:

Text -> String
Number -> Number
Date -> String or Number
Time -> String or Number
Timestamp -> String or Number
Container -> String
(no direct FileMaker equivalent) -> Object, Array, Null

Passing Containers

Up to this point, we have described how to send parameters in some sort of text format, whether the data type is date, time, string, or number. But what about binary data stored inside a native FileMaker container field? There are a few ways to send container fields as a script parameter, each with its own drawbacks. The first method is to use the native FileMaker Base64Encode() and Base64Decode() functions. Base64Encode() takes container data and converts it into a large block of text, making it easier to transport as a script parameter or to other data destinations. To reverse the encoding, use Base64Decode() to store the data back in its native FileMaker format. See this example of passing a container as Base64-encoded text:

Base64Encode ( MNF__Manufacturer::Logo )

To reverse the process, we use the Base64Decode() function, as shown in Figure 6:

Figure 6 - Use the Base64Decode function

Unfortunately, some file metadata is lost in translation when Base64 encoding your container data. For example, a JPEG image going through this process loses information related to creation date, modification date, latitude, and longitude, among other metadata. In some development situations, this is not an acceptable result. The alternative is to temporarily move the container data into a global container field and, from there, set your destination field to the value stored in the global container. An example of this is shown in Figure 7:

Figure 7 - Set the destination container to the global

Conclusion

As we can see, what started as the simple passing of one parameter to a script can become quite complex and varied. This should not dissuade you from using script parameters. Their benefits are numerous and can take your FileMaker development to the next level. By using script parameters, you can increase your development productivity and streamline various functions within your database with less code. I hope that laying out the options above provides an informative matrix for navigating the possibilities that work best for your development practice and leads you to learn more about native and custom functions in FileMaker Pro. Have fun passing your parameters!

Resource

Makah Encarnacao's DevCon Presentation - FileMaker Community

Questions? Leave a comment below or, if you need help with your FileMaker solution, please contact our team. The post Passing Script Parameters: A Complete Summary from Simple to Complex appeared first on Soliant Consulting.
View the full article