
Change in accuracy from reducing pre-hit distance?

I’ve been trying to reduce CMM runtime in our QC department, and I’ve found that enabling fly mode and cutting the pre-hit/retract distance in half trims an average of 13% off the runtime for our product. I’m trying to make this a standard, so I want to show my time studies to management. Before I do, I wanted to make sure it doesn’t change accuracy.

From what I know, the only downside to shortening the pre-hit distance from .1 to .05 is on an inconsistent part, like a casting: the probe might switch to touch speed too close to the surface (or already in contact) and throw an error when it touches the part. I also know it shortens the distance the probe will search past the theoretical point, but that can be adjusted by setting the check distance, right? So, is there any other reason you wouldn’t want to shorten the pre-hit distance?
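For the time study, the savings come from the probe creeping through the pre-hit zone at touch speed on every hit. A minimal back-of-envelope sketch of that effect, where the touch speed and hit count are assumed illustrative numbers (not values from any particular machine or program):

```python
# Illustrative model of slow-approach time vs. pre-hit distance.
# TOUCHSPEED and HITS are assumptions for the sketch, not measured values.

def approach_time(prehit_in, touchspeed_ips):
    """Seconds spent creeping through the pre-hit zone for one hit."""
    return prehit_in / touchspeed_ips

TOUCHSPEED = 0.08   # in/sec touch speed (assumed)
HITS = 500          # hits per program (assumed)

for prehit in (0.10, 0.05):
    total = HITS * approach_time(prehit, TOUCHSPEED)
    print(f"pre-hit {prehit:.2f} in -> {total:.1f} s of slow approach per run")
```

Halving the pre-hit distance halves the slow-approach time, so how much of the 13% it explains depends on what fraction of the cycle is spent in slow approach versus rapid moves.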
  • One thing I forgot to mention earlier: the probes need to be calibrated with the same pre-hit and retract distances you run with. If you shorten the pre-hit and retract on some programs but not all, then the calibrations for each probe will need to be managed separately. Can you trust your operators to use the correct calibration when it differs from one program to another? If you only have one part on each machine and run it 24/7, that’s easy to manage. If not, things can get out of hand quickly.
  • I think the main problem is that they just don’t want to change anything; once a new default is set, they won’t change it.