Tim W · Posted April 18, 2005

Hi all,

My question is this: I have a SQL data set that I want to routinely import into FileMaker. The data is customer addresses from our accounting system, and each customer can have multiple addresses. The source file is laid out as: CustNo, AddressCode, Addr1, etc.

The keys in the accounting system are not compound, but when I import into FileMaker I have not been able to figure out how to avoid duplicate records. A compound key on the source side would be nice, but I can't change that. My current idea for a workaround involves an import followed by a check-for-and-delete-duplicates routine. I need a scripted solution. Any better ideas will be received with great joy, because I know the workaround will take more time than I would like. The current source file has 4000+ records. Is there a way to use a validation to avoid the duplicate records?
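The "check for and delete duplicates" workaround described above boils down to treating CustNo + AddressCode as the record identity and keeping only the first occurrence of each pair. A minimal sketch of that logic in Python (field names CustNo/AddressCode come from the post; the sample rows are invented for illustration):

```python
def dedupe(records):
    """Keep only the first record seen for each (CustNo, AddressCode) pair."""
    seen = set()
    kept = []
    for rec in records:
        key = (rec["CustNo"], rec["AddressCode"])  # acts as a compound key
        if key not in seen:
            seen.add(key)
            kept.append(rec)
    return kept

# Hypothetical sample data: one customer, two address codes, one duplicate row.
rows = [
    {"CustNo": "C001", "AddressCode": "SHIP", "Addr1": "1 Main St"},
    {"CustNo": "C001", "AddressCode": "BILL", "Addr1": "PO Box 9"},
    {"CustNo": "C001", "AddressCode": "SHIP", "Addr1": "1 Main St"},  # duplicate
]
print(len(dedupe(rows)))  # 2
```

In FileMaker terms this corresponds to a looping script over records sorted by the compound key, deleting each record whose key matches the previous one.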
Tim W (Author) · Posted April 18, 2005

Resorted to plan B: an intermediary FileMaker file that takes in the raw export and calculates a compound key. Match on the compound keys and import only updates and new additions with a second import into the live file. A little maintenance keeps the intermediary file clean. This is better than my previous workaround.

HTH,
Tim
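The plan B above is essentially an upsert: build a compound key in the intermediary file, then let matching keys update existing live records while unmatched keys become new adds. A rough sketch of that matching logic in Python (the `"|"` separator and all sample data are assumptions for illustration, not from the post):

```python
def compound_key(rec):
    # CustNo and AddressCode are the source fields named in the thread;
    # "|" is an arbitrary separator chosen for this sketch.
    return rec["CustNo"] + "|" + rec["AddressCode"]

def sync(live, incoming):
    """Upsert: records whose compound key matches replace the live copy;
    records with a new key are added."""
    for rec in incoming:
        live[compound_key(rec)] = rec
    return live

# Hypothetical live table keyed by compound key, plus one fresh export.
live = {
    "C001|SHIP": {"CustNo": "C001", "AddressCode": "SHIP", "Addr1": "1 Main St"},
}
incoming = [
    {"CustNo": "C001", "AddressCode": "SHIP", "Addr1": "2 New Ave"},  # update
    {"CustNo": "C002", "AddressCode": "BILL", "Addr1": "PO Box 7"},   # new add
]
sync(live, incoming)
print(len(live))  # 2
```

In FileMaker this maps to the second import using "update matching records" on the compound key field, with "add remaining data as new records" enabled.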
Archived — this topic is closed to further replies.