I know I need more optimized code. The good news is that I have a project and I successfully got it working with VBA (the Stack Overflow developers mostly helped — thanks for that). But today I got feedback: it deleted 2 more unique entries from the records, and I do not know why it is removing them.
The algorithm I applied: I used the COUNTIF function that I found on Google.

    =COUNTIF(A$1:A2,A3)=0

Here A3 is the active cell, and the formula checks A1:A2 for duplicates of it. It returns FALSE if there is a duplicate of the value above it in column A, and TRUE if the value is unique. What I understood about COUNTIF is that it checks all the column values above the current cell: take A4, for example — it checks A1:A3 for a duplicate. Similarly, for A10 it checks A1:A9 and returns TRUE or FALSE. It mostly works, but I do not know what went wrong: for some entries it returns FALSE even though they are unique.
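Written out (the stray quotes in my paste above were a typo — the formula must not be entered as a text string; the helper column Z is just my arbitrary choice), the filled-down formulas look like this:

```
Z3:  =COUNTIF(A$1:A2,A3)=0    'TRUE when A3 does not appear in A1:A2
Z4:  =COUNTIF(A$1:A3,A4)=0    'TRUE when A4 does not appear in A1:A3
...
Z10: =COUNTIF(A$1:A9,A10)=0
```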
Applying this formula also takes longer and longer as my data grows. I am trying to make it cleaner and faster. People told me that this is not C or some other language built for speed, but I still need the code to be more optimized.

I need code for the conditions below. Can someone help me? I could not manage it myself and am a bit helpless here.
1) I have a column, and I have to check for duplicates in this column and delete the row if it is a duplicate.
2) I have 35,000 old records in the column, and around 2,000 new records are added each time. I need to check these 2,000 records against the full 37,000 (35,000 + 2,000), but the delete operation should only ever be performed on the newly added 2,000 records, even though the duplicate check covers the entire column.

To restate it: only the recently added 2,000 records should be checked for duplicates — both against the 35,000 old records and against themselves — and deleted when they are duplicates. No delete operation should ever be performed on the 35,000 records of old data.
I found several pieces of code, but they delete duplicates from the 35,000 old records as well. I tried setting the range, but that did not work either. Can someone help me with better code that takes less time? Thanks.
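To show what I mean, here is a sketch of the kind of macro I am after — not working code I have. The sheet name "Sheet1", the key column "A", and the boundary row 35002 between old and new data are placeholders based on my description above:

```
Sub DeleteNewDuplicates()
    ' Sketch only: "Sheet1", column "A", and FIRST_NEW_ROW are my
    ' placeholders; the real old/new boundary would have to be
    ' passed in or detected.
    Const FIRST_NEW_ROW As Long = 35002   ' first of the ~2000 new rows

    Dim ws As Worksheet, seen As Object, toDelete As Range
    Dim lastRow As Long, r As Long, key As String

    Set ws = ThisWorkbook.Worksheets("Sheet1")
    Set seen = CreateObject("Scripting.Dictionary")
    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row

    ' Remember every old value, but never mark old rows for deletion,
    ' even when the old data duplicates itself.
    For r = 1 To FIRST_NEW_ROW - 1
        seen(CStr(ws.Cells(r, "A").Value)) = True
    Next r

    ' A new row is deleted if its value was already seen in the old
    ' data or in an earlier new row (keep-first semantics).
    For r = FIRST_NEW_ROW To lastRow
        key = CStr(ws.Cells(r, "A").Value)
        If seen.Exists(key) Then
            If toDelete Is Nothing Then
                Set toDelete = ws.Rows(r)
            Else
                Set toDelete = Union(toDelete, ws.Rows(r))
            End If
        Else
            seen(key) = True
        End If
    Next r

    If Not toDelete Is Nothing Then toDelete.Delete
End Sub
```

Deleting all flagged rows in a single `.Delete` call, instead of row by row, should also avoid re-scanning the sheet after every deletion — which, I suspect, is where the formula approach spends its time on 37,000 rows.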
Updating my question with the data I have.
    A    B           F  G  H    I  Y
    PTY  39868.5     4  2  540  3  PTY39868.5425403
    GTY  34446.1234  2  1  230  1  GTY34446.1234212301
    PTY  3945.678    2  2          PTY3945.67822
    GTY  34446.1234  2  1  230  1  GTY34446.1234212301

Let us say these are the old 35,000 entries.
Explanation of the above example.
Above are the 35,000 old records. I have to check columns A, B, F, G, H and I for duplicates; if those six values are all the same, I have to delete the row. I should not worry about the other columns (C, D, etc.). So I used one unused column, Y, and combined these 6 columns into column Y using:
    =A2 & B2 & F2 & G2 & H2 & I2    (filled down with the respective row numbers)
Now I check column Y for duplicates and delete the entire row, since Excel 2003 only supports checking one column, as far as I know.
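One thing I am unsure about with this key (my own worry, not something I have confirmed): plain concatenation can let two different rows collide — A=12, B=34 gives "1234", and so does A=123, B=4. Putting a separator character that never appears in the data between the fields would rule that out, something like:

```
=A2 & "|" & B2 & "|" & F2 & "|" & G2 & "|" & H2 & "|" & I2
```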
Please note that even the 35,000 old records may have duplicates, but I should not delete those. For example, you can see that the 2nd and the last row in my sample data are duplicates, but I must not delete them because they are old data.
    A    B           F  G  H    I  Y
    PTY  39868.5     4  2  540  3  PTY39868.5425403       'old
    GTY  34446.1234  2  1  230  1  GTY34446.1234212301    'old
    PTY  3945.678    2  2          PTY3945.67822          'old
    GTY  34446.1234  2  1  230  1  GTY34446.1234212301    'old
    PTY  3945.678    1  1  230  2  PTY3945.678112302      'new
    PTY  39868.5     4  2  540  3  PTY39868.5425403       'new
    PTY  3945.678    1  1  230  2  PTY3945.678112302      'new
Now notice that the second-to-last row (the new PTY 39868.5 record) is a duplicate of an original record (the first PTY row), so I have to remove it. And the last new record is a duplicate of another new record (the first new PTY 3945.678 row), so I have to delete that one as well. So in the data above, I need to remove only the last 2 rows, which are duplicates of the original records or of the new records themselves. But the code should not delete the old GTY row, which is a duplicate that exists entirely within the original records.
I think I have made it clear now. About combining them into one cell: is this the best approach? Concatenation for 40,000 records takes only about 2 seconds, so I think that part does not matter, but any significantly better algorithm is very welcome.
I have also heard that values such as 45.00 and 45.000 can end up differing from each other once concatenated — could that be a problem here, since I have decimal points in my data? I was thinking maybe I should have done

    =I2 & H2 & G2 & F2 & A2 & B2

instead. Which concatenation is better — this one, or the one I wrote before?
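If the real issue is numbers that should match but differ only in their decimal representation, one idea (an untested assumption on my part — the "0.000000" format and six decimal places are just examples, not verified against my data) is to normalize the numeric fields with TEXT before concatenating, so that 45, 45.0 and 45.000 all produce the same key:

```
=A2 & "|" & TEXT(B2,"0.000000") & "|" & F2 & "|" & G2 & "|" & H2 & "|" & I2
```

With a separator and normalized text, the order of the columns should not matter for correctness, as long as every row uses the same order.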