
CMM consistently reports .0001" higher than micrometer

Here is the CMM setup:

I'm using a Global S 7.10.7 Green

Probe Setup:
HH-AS8-T2.5
HP-S-X1H_T
HP-S-X1H_26_SH
HP-S-X1h_M3_5WAY
Connection #1 tip3x10mm
Connection #5 convert to m2thrd tip3x50mm

The room is temperature-controlled to 68°F, and everything being measured has had time to acclimate to the room temperature.
I have temperature compensation enabled.
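
For a rough sense of scale, here is a back-of-the-envelope check of how much part-temperature error it would take to produce a full tenth on this diameter, assuming a steel part with a CTE of roughly 6.4 µin/in/°F (swap in the actual material's CTE if it is something else):

```python
# Rough thermal-expansion sanity check.
# Assumption: steel part, CTE ~6.4e-6 in/in per degF (material not stated above).
ALPHA_STEEL = 6.4e-6      # in/in per degF, approximate
DIA_NOMINAL = 1.7720      # inches, nominal diameter
OFFSET      = 0.0001      # inches, the observed CMM-vs-mic difference

growth_per_degF = ALPHA_STEEL * DIA_NOMINAL   # diameter change per degF of part temp error
degF_needed = OFFSET / growth_per_degF        # temp error required to explain the offset

print(f"Diameter growth per degF: {growth_per_degF:.7f} in")     # ~0.0000113 in
print(f"Temp error needed for a tenth: {degF_needed:.1f} degF")  # ~8.8 degF
```

With the lab held at 68°F and the parts soaked out, a temperature error that large seems unlikely, so thermal effects alone probably do not explain the whole tenth.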

I'm measuring a diameter that is 1.7720" +/- .0002"

I calibrate the angles with 25 hits per angle, answering "No" to "Has the sphere moved?" The sphere was set based off a 5x50 master.
I've tried calibrating the sphere with and without temperature compensation enabled.
I've noticed that when I don't use temperature compensation, my standard deviation can range from .0002" to .0003", and I even saw .001" once. With temperature compensation, the standard deviation is always less than a tenth.

I'm scanning outer diameters. I scan one half with T1A0B90 from 90 to -90 degrees with no filters, and the other half with T1A0B-90 from -90 to 90 degrees, also with no filters.
I then construct a best-fit circle out of the two halves, using every single scan point and no filter.
Next, I create a constructed filter feature from that circle, using the Gaussian filter type with a UPR of 177 and remove outliers set to 2. For the UPR, I found an equation online that gives a UPR based on the diameter: (D(mm) * PI) / 0.8.
Finally, I construct a best-fit circle with no filters from the filtered feature and report its diameter.
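
To sanity-check that UPR number and illustrate the fit/filter/refit chain outside of PC-DMIS, here is a rough Python sketch. This is not what PC-DMIS does internally; it assumes a simple least-squares (Kasa) circle fit, a standard Gaussian roundness filter with 50% transmission at the cutoff UPR, and it interprets "remove outliers = 2" as a 2-sigma cut. `scan_xy` is a hypothetical (N, 2) array of XY points from the two half scans.

```python
import numpy as np

def fit_circle_lsq(xy):
    """Algebraic (Kasa) least-squares circle fit; returns (xc, yc, r)."""
    x, y = xy[:, 0], xy[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    (xc, yc, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return xc, yc, np.sqrt(c + xc**2 + yc**2)

def upr_from_diameter(d_inch):
    """UPR rule quoted above: D(mm) * pi / 0.8."""
    return round(d_inch * 25.4 * np.pi / 0.8)

def filtered_diameter(scan_xy, upr_cutoff, sigma_cut=2.0, n_grid=3600):
    # 1) First best-fit circle from every raw scan point (no filter).
    xc, yc, _ = fit_circle_lsq(scan_xy)

    # 2) Radial profile about that center, resampled onto a uniform angle grid.
    dx, dy = scan_xy[:, 0] - xc, scan_xy[:, 1] - yc
    theta = np.mod(np.arctan2(dy, dx), 2 * np.pi)
    rad = np.hypot(dx, dy)
    order = np.argsort(theta)
    grid = np.linspace(0, 2 * np.pi, n_grid, endpoint=False)
    prof = np.interp(grid, theta[order], rad[order], period=2 * np.pi)

    # 3) Gaussian low-pass in the harmonic (UPR) domain:
    #    50% amplitude transmission at the cutoff UPR.
    spec = np.fft.rfft(prof - prof.mean())
    n = np.arange(spec.size)
    spec *= np.exp(-np.log(2) * (n / upr_cutoff) ** 2)
    smooth = np.fft.irfft(spec, n_grid) + prof.mean()

    # 4) 2-sigma outlier rejection against the smoothed profile,
    #    then a final unfiltered best-fit circle on the kept points.
    resid = rad - np.interp(theta, grid, smooth, period=2 * np.pi)
    keep = np.abs(resid - resid.mean()) <= sigma_cut * resid.std()
    _, _, r_final = fit_circle_lsq(scan_xy[keep])
    return 2 * r_final

print("UPR for a 1.7720 in diameter:", upr_from_diameter(1.7720))  # -> 177
# reported_dia = filtered_diameter(scan_xy, upr_cutoff=177)  # scan_xy is hypothetical
```

The check confirms the arithmetic behind 177: 1.7720" is about 45.0 mm, the circumference is about 141.4 mm, and 141.4 / 0.8 ≈ 177.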

Initially, I was consistently getting one-to-one results between the CMM measurements and a micrometer.

Recently, however, I'm finding that the CMM consistently reports .0001" higher than what I measure with two different micrometers.

I know .0001" is extremely small and I'm trusting that the micrometers are giving me the more accurate results.

Am I doing something wrong?
Is it reasonable to expect the CMM and a micrometer to match to within one ten-thousandth of an inch?
There have been a couple of crashes (not huge ones); could they have thrown the probe off by .0001"?

Thanks for any help in advance.
  • .0001" is pretty darn good! Size to size is achievable but hard to come by in a production setting. Perhaps the CMM is taking into account the shape/roundness into account where your micrometer is only measuring two points... I don't know. A tenth is tiny. Do you have multi-point micrometers, I don't know if it's even worth the trouble... are you doing this for funzies or on a production part? If the latter what tolerance are you playing with?
  • I'll have to check the roundness tomorrow, but I'm measuring a cylinder made up of 5 circles. Looking at each individual circle, I measure in relatively the same spot with a mic, and across the board it's pretty much always .0001" higher at each circle. I'm doing this for a production part and the tolerance is +/- .0002", so being .0001" off is half the tolerance. What's driving me crazy is that for a while it was actually 1:1 between the mic and the CMM, so I don't know what happened. At first the diameter was spot on but the runout was off, and of course now that I've finally gotten the runouts to match a dial indicator within 1-2 tenths for the most part, the diameters aren't matching perfectly. Go figure.
  • I had to squint to see how many 0's I was looking at...