
Change in accuracy from reducing pre-hit distance?

I’ve been trying to reduce CMM runtime in our QC department, and I’ve found that using fly mode and cutting the pre-hit/retract distance in half trims an average of 13% off the runtime for our product. I’m trying to make this a standard, so I want to show my time studies to management. Before I do, I want to make sure there are no changes in accuracy.

From what I know, the only downside to shortening the pre-hit distance from .1 to .05 is that on an inconsistent part, such as a casting, the probe might move too close before taking a hit and throw an error when it touches the part. I also know it shortens the distance the probe will search past the theoretical point, but that can be adjusted by setting the check distance, right? So, is there any other reason you wouldn’t want to shorten the pre-hit distance?
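A rough way to picture the two failure modes described above is to treat the pre-hit and check distances as a search window around the theoretical point. This is only an illustrative sketch, not PC-DMIS's actual search logic; the function and values are hypothetical:

```python
def hit_detected(surface_offset, prehit, check):
    """Sketch of a touch-probe search window (illustrative, not PC-DMIS's real logic).

    surface_offset: where the actual surface sits relative to the
    theoretical point, in inches. Negative = material higher than
    nominal (toward the probe); positive = material lower than nominal.
    prehit: standoff the probe moves to before the slow measurement move.
    check: how far past the theoretical point the probe keeps searching
    before reporting a missed hit.
    """
    if surface_offset <= -prehit:
        # Surface sits proud of nominal by more than the standoff: the
        # probe contacts it during the positioning move -> error.
        return False
    if surface_offset > check:
        # Surface is deeper than the search window -> missed hit.
        return False
    return True

# A casting surface high by .06 is fine with a .1 prehit but errors at .05:
print(hit_detected(-0.06, prehit=0.1, check=0.1))    # True
print(hit_detected(-0.06, prehit=0.05, check=0.05))  # False
```

The second case is the cast-part scenario above: the surface variation exceeds the shortened standoff, so the probe touches the part before the measurement move begins.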
  • Touchspeed has a large impact on measurement repeatability; however, I've not found that calibrating with a prehit/retract of .1 and measuring with a prehit/retract of .05-.2 makes any significant difference. Measuring with prehit and retract at something like .01-.04, however, has had adverse effects on my evaluations.
    Then again, each machine seems to react a little differently, so your best bet is to do some testing at each workstation to see whether these variables cause you any evaluation trouble.
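One simple way to run that per-workstation test is to take a set of repeat hits on a reference feature at each prehit setting and compare the spread. A minimal sketch; the hit values below are placeholders, not real data:

```python
import statistics

# Repeated hits on the same reference feature (inches) at two
# prehit/retract settings. These values are placeholders; substitute
# data exported from your own machine's runs.
hits_prehit_010 = [0.5001, 0.5000, 0.5002, 0.5001, 0.5000, 0.5001]
hits_prehit_005 = [0.5002, 0.4999, 0.5003, 0.5000, 0.5001, 0.5002]

def spread(samples):
    """Return (mean, sample standard deviation) for a set of hits."""
    return statistics.mean(samples), statistics.stdev(samples)

for label, data in [("prehit .100", hits_prehit_010),
                    ("prehit .050", hits_prehit_005)]:
    mean, sd = spread(data)
    print(f"{label}: mean={mean:.5f}  stdev={sd:.6f}")
```

If the standard deviation at the shorter prehit is not meaningfully larger than at the longer one, that workstation is a candidate for the faster setting.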