Help with accuracy
Hi everyone, I'm having tons of problems with accuracy. I have older iGaging scales, and nothing I do gets acceptable results. I got most of the jumping out, except for .0004 that happens occasionally. That's not the issue, though; the accuracy is off after I calibrate. I can change the CPI to get it to land on 3.0000, but when I move it does not stay close enough. If I move a small amount it works: say I move .015, the reading is within a couple of tenths. But if I then move 3", I get 2.89. It seems I have a small error that multiplies. Is there any care I should take in mounting the scales? They were attached when I got the mill, but I have since taken them on and off multiple times. Any help will be greatly appreciated.
Did you calibrate the scales?
I did. I used your method from YouTube with 1-2-3 blocks. It took me a little while to get the CPI right so it would show 3"; after that is when the issues start. I could have done it wrong: I basically stepped the CPI up in ascending order until it read 3". Could the scales be causing this if they are not mounted perfectly parallel with the axis?
In my experience with the iGaging scales, I ended up not worrying about the readout showing 3.000" after changing the CPI, as shown in Yuriy's video. I kept having the same trouble you're having.

It's cumbersome, but I ended up indicating the 1-2-3 block over and over: zero the indicator on the block, zero the readout, carefully remove the block, move the axis until the indicator reads zero again, check the readout, adjust the CPI a little (remember that you can use decimals), move the axis back to the starting point, and repeat. Eventually I got it to read dead nuts.
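For anyone repeating this, the CPI correction on each pass can be computed rather than guessed. A minimal sketch, assuming the readout divides raw counts by CPI (the function and variable names below are mine, not from the app): the scale produced readout × CPI raw counts while the axis actually moved the true distance, so the corrected CPI is that count divided by the true distance.

```python
def corrected_cpi(current_cpi, readout_inches, true_inches):
    """Estimate a better CPI from one calibration pass.

    The scale emitted roughly readout_inches * current_cpi raw counts
    while the axis actually moved true_inches, so the true
    counts-per-inch is that count divided by the true distance.
    """
    raw_counts = readout_inches * current_cpi
    return raw_counts / true_inches

# Example: readout showed 2.890" over a true 3.000" move at CPI 2560
print(round(corrected_cpi(2560, 2.890, 3.000), 2))  # → 2466.13
```

With a good indicator reading, one pass like this should land very close; the repeated passes in the procedure above then just confirm the result.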
YouTube: FnA-Wright Engineering
I will have to mess with it, then. It really is frustrating because it makes no sense to me. I just went and set up a dial indicator, zeroed on the side of a 1-2-3 block, moved down to the table, and my reading was 1.6938. Then I zeroed the Z axis on the tablet and moved .050; the tablet read .0497. That makes no sense to me, and I really am at a loss. Thanks for taking the time to reply; I will report back if anything changes.
Something just occurred to me, though please correct me if I'm wrong or overthinking things. I'm pretty sure that these, and pretty much all scales from China, are metric; the conversion to inches is done in the app. This got me thinking: it might be more accurate to calibrate the scale using millimeters, because that would eliminate any rounding done within the app during the conversion to inches.

So, I set up a 1-2-3 block as a stop and zeroed my 0.0001" indicator on a 3" gauge block up against the 1-2-3 block. Then I changed the CPI for the X axis to 3. Next, back at the main screen, I swapped the readout to mm and zeroed the axis. I then removed the gauge block and cranked over until my indicator zeroed onto the 1-2-3 block. The readout read 65,210.267. (If you do this, I would recommend making several approaches to the indicator, to make sure you get the same reading every time.)

Now, since we're in metric, I divided that number by 25.4 and got 2567.33335. I entered that as the CPI for that axis and repeated the test with the readout set to inches: 3.0000" dead on. I re-zeroed the indicator on the 3" gauge block, then swapped it for a 2" block, cranked in until the indicator zeroed, and the readout showed -1.0000. Again with a 1" block, then a 0.750", and a few others. It always showed the correct measurement, sometimes flickering ±0.0002". Given that you shouldn't expect better than three decimal places of accuracy with these scales, that's well within their resolution.
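To spell out why that division works (a sketch of the arithmetic, not the app's actual code): with the CPI temporarily set to 3 and a 3" move, the mm readout equals counts ÷ 3 × 25.4, so dividing the mm figure by 25.4 gives counts ÷ 3, which is exactly counts per inch for a 3" move.

```python
MM_PER_INCH = 25.4

# mm readout observed over a 3" move with the CPI temporarily set to 3
readout_mm = 65210.267

# display_mm = counts / temp_cpi * 25.4; with temp_cpi = 3 and a 3" move,
# display_mm / 25.4 = counts / 3 = counts per inch
cpi = readout_mm / MM_PER_INCH
print(round(cpi, 5))  # → 2567.33335
```

Note the trick only cancels out this neatly because the temporary CPI (3) matches the travel distance in inches (3"); for a different travel you would also divide by travel ÷ temp_cpi.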

Again, if I'm off base here, I encourage Yuriy (or anyone) to please correct me.
YouTube: FnA-Wright Engineering
The app is metric under the hood (the base unit is the micron). You can get very close with calibration (counts per inch can be a decimal), but iGaging scales are neither truly metric nor imperial. Most other scales have a CPI of 2540, i.e. exactly 10 microns per count; iGaging scales have a CPI of 2560 (I suspect something got lost in translation when they were designed).

For all practical purposes, the last digit is garbage. The original displays just hide the flicker from you by averaging 8 readings, but the scale can't reliably resolve it. I'm trying to do something in the new version of the firmware to hide the flicker, but I haven't finalized the code yet.
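A minimal sketch of the kind of 8-reading averaging described above (my assumption about how such a filter might look, not the actual display firmware or the TouchDRO code):

```python
from collections import deque

class ReadingSmoother:
    """Average the last n raw readings to hide last-digit flicker."""

    def __init__(self, n=8):
        # deque with maxlen drops the oldest reading automatically
        self.window = deque(maxlen=n)

    def update(self, raw_count):
        self.window.append(raw_count)
        return sum(self.window) / len(self.window)

smoother = ReadingSmoother()
# A stationary scale flickering by one count: 7680, 7681, 7680, ...
for raw in [7680, 7681, 7680, 7681, 7680, 7681, 7680, 7681]:
    value = smoother.update(raw)
print(value)  # → 7680.5
```

The averaged value settles between the two flickering counts instead of jumping, which is why an original display looks steadier than the raw data actually is.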

All that said, I'd like to get to the bottom of why the calibration procedure isn't working. In the last few years, I've heard from about a dozen people that had exactly the same issue, but I can't reproduce it, so there is something that is interfering with the procedure, either in the settings or in the initial state of the app.
Calibration is really as simple as:
1. Change the display to see the raw counts
2. Move a known distance (in inches)
3. Divide raw count by the distance (in inches) moved
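The three steps above amount to a single division. A sketch in Python (the names are mine):

```python
def counts_per_inch(raw_counts, distance_inches):
    """Step 3: divide the raw count delta by the distance moved (in inches)."""
    return raw_counts / distance_inches

# Example: 7702 raw counts observed over a 3.000" move
print(round(counts_per_inch(7702, 3.000), 2))  # → 2567.33
```

The result can (and usually will) be a decimal; as noted above, the app accepts fractional CPI values, so don't round it off to a whole number.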

Hey man, thank you for taking the time to help. So far, from what I can tell, your method of calibration has worked. I just went down and did the X and Y. I even got out my gauge blocks, set a bunch up, tested different values, and got readings within +/- .0004. I am going to run a few more tests to see how it works out over longer distances of travel (7+ inches). It must not be a scale issue but user error with calibration.
