To prevent adding a duplicate record to a database using Recordsets, is it faster to test a field (indexed with No Duplicates) using DLookup, or to simply trap the error created when the .Update fails?
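For context, here is a minimal sketch of the two approaches I'm comparing. The table and field names (tblCustomers, CustomerID, lngID) are just examples; assume a DAO recordset rs opened on a table whose CustomerID field is indexed with No Duplicates.

    ' Approach 1: check with DLookup first, only add if no match is found
    If IsNull(DLookup("CustomerID", "tblCustomers", "CustomerID = " & lngID)) Then
        rs.AddNew
        rs!CustomerID = lngID
        rs.Update
    End If

    ' Approach 2: just attempt the add and trap the duplicate-key error
    On Error Resume Next
    rs.AddNew
    rs!CustomerID = lngID
    rs.Update                    ' fails if the index already contains this value
    If Err.Number <> 0 Then rs.CancelUpdate: Err.Clear
    On Error GoTo 0

In the real code the values come from an import file rather than a variable, but that's the shape of it.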
Scott Axton 13 months ago
My preference is the DLookup. I'd rather bypass the records than deal with the error created.
I don't know that one is faster than the other unless you factor in manual intervention.
Gary James 13 months ago
The reason I ask is that I get updates of CSV data files that contain a mix of previous and new records. The files can run to millions of records and can take hours to process. Anything that speeds up adding the new records and skipping the old could reduce the overall time considerably.
Kevin Yip 13 months ago
Trapping errors is the last resort for everything. If .Update causes a run-time error, it could be anything: a lost link to a table, a network outage, an Internet outage, etc. Any number of things could cause .Update to fail. You would have to check the error code to see what the error actually was. Different types of errors return different error codes: lost connection, key violation, write conflict, or things you don't necessarily know about. It is not a trivial task to design this kind of error handling. And that answers your question about which is faster, once you take into account the time it would take to write such an error handler. Error handling (On Error GoTo...) is generally used for issues you don't know about in advance -- because if you knew about them, you would do your best to avoid them. For issues you do know about, you always write specific code to tackle them.
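To illustrate, here is a minimal sketch of the kind of handler you would need. The table and field names are made up; 3022 is the usual Jet/ACE duplicate-key error, while linked ODBC tables often report problems as 3146 ("ODBC--call failed") with the real cause buried in DBEngine.Errors, so even this sketch is incomplete.

    Sub AddOneRecord(rs As DAO.Recordset, lngID As Long)
        On Error GoTo UpdateErr
        rs.AddNew
        rs!CustomerID = lngID          ' example field name
        rs.Update
        Exit Sub

    UpdateErr:
        Select Case Err.Number
            Case 3022                  ' duplicate index / primary key (Jet/ACE)
                rs.CancelUpdate        ' discard the pending record and move on
            Case 3146                  ' "ODBC--call failed" (linked SQL Server tables)
                Debug.Print DBEngine.Errors(0).Number, DBEngine.Errors(0).Description
                rs.CancelUpdate        ' may or may not be a key violation -- you have to check
            Case Else                  ' lost connection, write conflict, etc.
                MsgBox "Unexpected error " & Err.Number & ": " & Err.Description
        End Select
    End Sub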
Gary James 13 months ago
As far as handling errors is concerned, if I get an error on .Update it makes no difference what the error is. It represents a failure to add the record to the database, and in that case I just move on to the next record. This is why I'm curious whether the low-level SQL Server database management code runs faster testing the indexes for a match and raising an error than the high-level VBA code that builds a criteria string, runs the DLookup, and tests the return value for a match. If nobody has any information on this I might write some test code that tries to add 100,000 duplicate records and times how long each method takes to finish.
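If I do, a rough timing harness along these lines is what I have in mind. AddWithDLookup and AddWithErrorTrap would be my two implementations of the add routine, the table name is just an example, and it assumes IDs 1 to 100,000 already exist so every attempt is a duplicate. Timer simply returns seconds since midnight, which is good enough for a comparison.

    Sub TimeBothMethods()
        Dim db As DAO.Database, rs As DAO.Recordset
        Dim t As Single, i As Long

        Set db = CurrentDb
        Set rs = db.OpenRecordset("tblCustomers", dbOpenDynaset, dbSeeChanges)

        t = Timer
        For i = 1 To 100000
            AddWithDLookup rs, i       ' DLookup check, then AddNew / .Update
        Next i
        Debug.Print "DLookup method:    "; Timer - t; " seconds"

        t = Timer
        For i = 1 To 100000
            AddWithErrorTrap rs, i     ' AddNew / .Update inside an error handler
        Next i
        Debug.Print "Error-trap method: "; Timer - t; " seconds"

        rs.Close
        Set rs = Nothing
        Set db = Nothing
    End Sub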
Kevin Yip 13 months ago
Error handling is not there for your convenience, but to let you know what *exactly* the error was. If .Update causes an error, it may NOT be due to a duplicate key, but something else entirely that you need to know about. Again, trapping an error in such a fashion is a last resort, and it only works if you are sure the error you expect is the ONLY error that could ever happen in that code. Pretty big if.
Gary James 13 months ago
Kevin, you've gone off topic to argue semantics about handling errors. The point of my question was entirely about performance. When I'm processing new user data, the record is successfully added to the database 100% of the time, which means there is no failure on the .Update and the DLookup isn't even needed. This was the case when I first created the database and my data was entirely unique. But now that I'm entering a mix of new and old records, the performance of DLookup vs. error trapping to handle duplicate entries becomes significant. I appreciate your input, but I don't want to engage in a debate about something unrelated to my original post. I fully understand error handling; I've been in software design for the past 40 years.
Scott Axton 13 months ago
Gary -
Check out the DLookup Slow video. In it Richard discusses optimizing queries and DLookup.
Things like indexes, running locally vs. over a network, etc. can all affect performance. Though I don't have experience with them and can't even remember what they are called, I know you can run a query on the server side in SQL -- see the rough sketch at the end of this post.
You mentioned testing out SQL Server Express, and given the number of records you're talking about, I think it's worth a shot. What do you have to lose, other than maybe some time invested?
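If I'm thinking of the right thing, they might be called pass-through queries -- but treat this as a rough sketch only, since I haven't used them myself and the DSN and table names here are made up. The idea is to let SQL Server skip the duplicates itself in one set-based statement instead of checking row by row from Access.

    Sub RunServerSideDedupe()
        Dim qdf As DAO.QueryDef
        Set qdf = CurrentDb.CreateQueryDef("")       ' temporary, unnamed pass-through query
        qdf.Connect = "ODBC;DSN=MySqlServerDSN;Trusted_Connection=Yes;"
        qdf.SQL = "INSERT INTO dbo.tblCustomers (CustomerID, CustomerName) " & _
                  "SELECT s.CustomerID, s.CustomerName FROM dbo.tblStaging s " & _
                  "WHERE NOT EXISTS (SELECT 1 FROM dbo.tblCustomers c " & _
                  "WHERE c.CustomerID = s.CustomerID)"
        qdf.ReturnsRecords = False                   ' it's an action query, no results back
        qdf.Execute dbFailOnError                    ' the SQL runs entirely on the server
        Set qdf = Nothing
    End Sub

The CSV would have to be loaded into the staging table first, of course, either from Access or with SQL Server's own import tools.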
Kevin Yip 13 months ago
Hi Gary, again, the bottom line is that an error from .Update is NOT necessarily from a duplicate key entry. So why are you using it to check for duplicate entries? It's a perfectly on-topic thing to point out that your method simply doesn't work. You may understand error handling, but you don't seem to know that many different errors could cause .Update to fail. Having N years of experience may work against you if it makes you think you can do no wrong.
Gary James 13 months ago
Scott, I went through the process of installing and moving my empty database over to SQL Server Express using the SQL Server Migration Assistant. Everything went smoothly and I've added nearly 8 million new records to the database. However, the migration tool added something odd to the Tables section of my Access Objects list: what appear to be mirrors of the actual linked tables.
Scott Axton 13 months ago
Gary, as I mentioned in my previous post, I don't have any experience using SQL Express or Server. I'm following along with your posts hoping I can learn from your experience! lol
Hopefully someone who does have that knowledge will chime in here and we'll both benefit!
Good luck