At the beginning of each shift, I calibrate my stylus to a 0.001 mm standard deviation. About 80% of the time I only use it in position A0B0. When it shows 0.002 (or even 0.003), I start wiping everything in sight until it reads 0.001 again. Sometimes that takes a long time. How do you solve this problem?
jjewell, since the only common angle we run is A0B0, and given the amount of time it would take to calibrate every angle, I don't see how it is a waste of time. Instead of making condescending remarks to Vladimir and myself, why don't you enlighten us?
Not trying to hijack this, but commenting on a comment above: if you are NOT a little OCD and you work on a CMM / in inspection, you are probably a crappy programmer and not right for this job / industry... it all comes down to the right level of OCD for the task at hand... just sayin'.
THIS WASN'T A SHOT AT ANYONE!!!! Just a joke. I think everyone on here is OCD to a point. That's what makes us good at what we do. Attention to detail and such.
Yup, but this world is not the same anymore... nobody can take a joke, they get defensive ASAP... let me off this ride please, sir, I am about to puke.
You didn't answer my question, first of all... NASA? Remember? These microns mean nothing to 99% of the industries out there. Figure out your uncertainty: add in the machine accuracy, the repeatability, the TTP uncertainty... and then you will see the light and understand that worrying about 2 microns is like worrying about how far the Earth is off its normal axis. Consider yourself enlightened... and condescended to. BUT, if you think it is not a waste of time, have at it. My 35 years of experience on a CMM across many industries tell me it's a waste of the company's money.
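The uncertainty-budget argument above can be sketched numerically: independent uncertainty components are typically combined by root sum of squares (RSS). The component values below are purely illustrative assumptions, not figures from this thread or from any particular machine's spec sheet.

```python
import math

def combined_uncertainty(components_mm):
    """Root-sum-of-squares combination of independent
    uncertainty components, each given in millimetres."""
    return math.sqrt(sum(u ** 2 for u in components_mm))

# Illustrative values only (assumed for the example):
machine_accuracy = 0.0020   # mm, e.g. from the machine's accuracy spec
repeatability    = 0.0010   # mm, e.g. from a repeat-measurement study
probe_tip        = 0.0015   # mm, probe/tip qualification uncertainty

u = combined_uncertainty([machine_accuracy, repeatability, probe_tip])
print(f"Combined uncertainty: {u:.4f} mm")  # roughly 0.0027 mm here
```

With assumed components at this scale, the combined uncertainty comes out near 0.0027 mm, so a calibration standard deviation drifting from 0.001 to 0.002 mm sits inside the overall noise of the measurement, which is the point being made above.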