The room is temperature-controlled to 68 degrees, and everything being measured has had time to acclimate to room temperature.
I have temperature compensation enabled.
I'm measuring a diameter that is 1.7720" +/- .0002"
I calibrate the angles with 25 hits per angle, answering "no" to "has the sphere moved?" The sphere was set off of a 5x50 master.
I've tried calibrating the sphere with and without temperature compensation enabled.
I've noticed that when I don't use temperature compensation my stddev can range from .0002 to .0003, and I even saw .001 once. With temperature compensation the stddev is always less than a tenth (.0001").
I'm scanning outer diameters. I scan one half with T1A0B90 from 90 to -90 degrees with no filters, and the other half with T1A0T-90 from -90 to 90 degrees, also with no filters.
I then construct a best-fit circle from the two halves using every single scan point and no filter.
I create a constructed filter from the constructed circle, using the Gaussian filter type with a UPR of 177 and outlier removal set to 2. For the UPR I found an equation online that gives a UPR based on the diameter: UPR = (D(mm) × π) / 0.8.
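For reference, that equation does work out to the 177 I'm using. Quick check below (the variable names are just mine):

```python
import math

# Nominal diameter in inches, converted to mm for the UPR equation I found online.
diameter_in = 1.7720
diameter_mm = diameter_in * 25.4          # ~45.01 mm

# UPR = (D(mm) * pi) / 0.8, rounded to the nearest whole undulation per revolution.
upr = round(diameter_mm * math.pi / 0.8)  # ~176.75 -> 177
print(upr)
```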
From the constructed filter I then construct a final best-fit circle with no filters and report its diameter.
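In case it helps to see the whole pipeline in one place, here's roughly what I understand my routine to be doing, written as a numpy sketch. This is only my own approximation with synthetic data (not what the CMM software actually does internally), and the outlier handling and Gaussian cutoff are simplified:

```python
import numpy as np

def fit_circle(x, y):
    """Least-squares (Kasa) circle fit: returns center (a, b) and radius r."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    c = np.linalg.lstsq(A, x**2 + y**2, rcond=None)[0]
    a, b = c[0], c[1]
    r = np.sqrt(c[2] + a**2 + b**2)
    return a, b, r

def gaussian_lowpass_upr(profile, cutoff_upr):
    """Gaussian low-pass on a uniformly sampled radial profile, cutoff in UPR."""
    n = len(profile)
    k = np.fft.rfftfreq(n, d=1.0 / n)         # harmonic numbers 0..n/2 (UPR)
    alpha = np.sqrt(np.log(2) / np.pi)        # 50% transmission at the cutoff
    transmission = np.exp(-np.pi * (alpha * k / cutoff_upr) ** 2)
    return np.fft.irfft(np.fft.rfft(profile) * transmission, n)

# --- demo on synthetic data: a 1.7720" circle with noise and a few fliers ---
rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, 720, endpoint=False)
true_r = 1.7720 / 2
r_meas = true_r + rng.normal(0, 0.00005, theta.size)
r_meas[::90] += 0.0005                         # a few outlier points
x, y = r_meas * np.cos(theta), r_meas * np.sin(theta)

# 1) raw best-fit circle from every scan point, no filter
a, b, r = fit_circle(x, y)

# 2) radial deviations about that circle, 2-sigma outlier removal
dev = np.hypot(x - a, y - b) - r
keep = np.abs(dev - dev.mean()) <= 2 * dev.std()
dev_clean = np.where(keep, dev, dev.mean())    # replace fliers before filtering

# 3) Gaussian low-pass at 177 UPR, then 4) final best-fit circle and diameter
dev_filt = gaussian_lowpass_upr(dev_clean, 177)
r_filt = r + dev_filt
xf, yf = a + r_filt * np.cos(theta), b + r_filt * np.sin(theta)
_, _, r_final = fit_circle(xf, yf)
print(f"reported diameter: {2 * r_final:.5f} in")
```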
Initially, I was consistently getting one-to-one results between the CMM measurements and a micrometer.
Recently, however, I'm finding that the CMM is consistently reporting .0001" higher than what I measure with two different micrometers.
I know .0001" is extremely small and I'm trusting that the micrometers are giving me the more accurate results.
Am I doing something wrong?
Is it reasonable to assume that I can get the CMM and a micrometer to match to within one ten-thousandth of an inch?
There have been a couple of crashes (not huge crashes); could this have thrown off the probe by .0001"?