At the beginning of each of my shifts, I calibrate my stylus to 0.001 mm. About 80% of the time I only use it at A0B0. When it reads 0.002 (or even 0.003) I start wiping everything in sight until it's back down to 0.001. Sometimes that takes a long time. How do you deal with this?
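The routine described above (calibrate, check the deviation, clean and retry until it passes) can be sketched as a simple loop. This is just an illustration of the workflow, not any particular CMM software's API; the function names and the 0.001 mm limit are assumptions taken from the post:

```python
CAL_LIMIT_MM = 0.001  # acceptable calibration deviation, per the post above

def calibrate_stylus(measure_sphere, clean_and_retry, max_attempts=5):
    """Re-run calibration until the reported deviation is within limit.

    measure_sphere: hypothetical callable returning the calibration
        deviation in mm (e.g. standard deviation from the sphere cal)
    clean_and_retry: hypothetical callable that wipes the tip and the
        calibration sphere between attempts
    """
    for attempt in range(1, max_attempts + 1):
        deviation = measure_sphere()
        if deviation <= CAL_LIMIT_MM:
            return deviation  # calibration passed
        clean_and_retry()  # wipe everything, then try again
    raise RuntimeError("Cal never reached limit; check tip/sphere for damage")
```

The `max_attempts` cap is the one thing the loop adds over the manual routine: if wiping never gets the number down, something physical (a chipped ruby, a dinged sphere) is the likelier cause.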
A-machine-insp I calibrate before every part as well for the same reason
jjewell Idk what you're going on about. You're a demi-guru on here and a member since 2006, so I'd hope you calibrate your probes..
I don't think you understood what A-machine-insp was saying. He wasn't saying he needs .001 mm of deviation; he was just pointing out that calibrating before running is good enough as long as it doesn't fail the cal.
Maybe you work in oil and gas with loose tolerances, or at a forge where you wouldn't notice a difference, but I'm checking | TP | .002 | A | B | C | at complex angles.
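For readers less familiar with the | TP | .002 | A | B | C | callout: that's a true-position tolerance, and for a 2-D feature the diametral true position is twice the radial offset from nominal, TP = 2·sqrt(dx² + dy²). A minimal sketch of that check (the example offsets are made up):

```python
import math

def true_position(dx, dy):
    """Diametral true position for a 2-D feature offset from nominal.

    dx, dy: measured deviations from the nominal location; the result
    is in the same units as the inputs.
    """
    return 2.0 * math.sqrt(dx * dx + dy * dy)

# Hypothetical hole measured 0.0005 off in X and 0.0005 off in Y:
tp = true_position(0.0005, 0.0005)
print(tp <= 0.002)  # does it pass a .002 TP tolerance?
```

The point of the sketch is how little margin a .002 TP leaves: barely half a thou of combined offset eats most of the zone, which is why a stylus that cals at 0.003 instead of 0.001 matters here and wouldn't at a forge.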
I pretty much calibrate every time I switch to a different program unless it's just A0B0; that's sufficient unless I'm doing something special.