The room is temperature controlled to 68 degrees. Everything being measured has had time to soak to room temperature.
I have temperature compensation enabled.
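For context, temperature compensation is essentially a linear-expansion correction. Here's a rough sketch of the idea, assuming a steel part, a nominal CTE of about 6.4e-6 in/in/°F, and the standard 68 °F reference; the actual coefficient and formula your CMM software uses may differ:

```python
# Sketch of linear temperature compensation (assumed nominal values).
# L_ref = L_measured / (1 + alpha * (T_part - T_ref))

ALPHA_STEEL = 6.4e-6   # in/in/degF, assumed nominal CTE for steel
T_REF = 68.0           # degF, standard reference temperature

def compensate(length_measured, t_part, alpha=ALPHA_STEEL, t_ref=T_REF):
    """Scale a measured length back to what it would be at 68 degF."""
    return length_measured / (1.0 + alpha * (t_part - t_ref))

# A 1.7720" diameter on a part that is 4 degF warm reads roughly
# 0.000045" oversize before compensation:
measured = 1.7720 * (1.0 + ALPHA_STEEL * 4.0)
print(measured - 1.7720)              # growth due to 4 degF
print(round(compensate(measured, 72.0), 4))  # back to ~1.7720
```

At a few degrees off reference the correction is already a good chunk of a tenth on this size part, which lines up with the stddev difference you're seeing with compensation on vs. off.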
I'm measuring a diameter that is 1.7720" +/- .0002"
I calibrate each angle with 25 hits, answering "No" to "Has the sphere moved?". The sphere was set up based on a 5x50 master.
I've tried calibrating the sphere with and without temperature compensation enabled.
I've noticed that when I don't use temperature compensation my stddev can range from .0002" to .0003"; I even saw .001" one time. With temperature compensation the stddev is always less than a tenth.
I'm scanning outer diameters. I scan one half with T1A0B90, 90 to -90 degrees, no filters, and the other half with T1A0B-90, -90 to 90 degrees, no filters.
I then construct a best-fit circle from the two halves, using every single scan point and no filter.
I create a constructed filter from that circle, using the Gaussian filter type with a UPR of 177 and remove outliers set to 2. For the UPR I found an equation online that derives UPR from the diameter: (D(mm) * PI)/.8
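For reference, here is that UPR equation worked through for this part. The 0.8 mm divisor is just the constant from that online formula; treat it as an assumption rather than a standard:

```python
import math

def upr_from_diameter(d_inch, divisor_mm=0.8):
    """UPR = circumference in mm / divisor; a rule of thumb, not a standard."""
    d_mm = d_inch * 25.4
    return (d_mm * math.pi) / divisor_mm

# 1.7720" diameter -> about 177 undulations per revolution
print(round(upr_from_diameter(1.7720)))  # 177
```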
I construct a final best fit circle with no filters from the constructed filter and report this diameter.
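For anyone curious what a best-fit circle is doing under the hood, here is a simplified stand-in (the algebraic Kasa least-squares fit) for whatever fit the CMM software actually uses; it recovers a center and radius from scan points:

```python
import math

def solve3(M, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(M, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit.

    Solves the normal equations for x^2 + y^2 + A*x + B*y + C = 0,
    then converts (A, B, C) to a center and radius."""
    n = len(points)
    sx = sum(x for x, y in points); sy = sum(y for x, y in points)
    sxx = sum(x * x for x, y in points); syy = sum(y * y for x, y in points)
    sxy = sum(x * y for x, y in points)
    z = [x * x + y * y for x, y in points]
    M = [[sxx, sxy, sx],
         [sxy, syy, sy],
         [sx,  sy,  float(n)]]
    rhs = [-sum(zi * x for zi, (x, y) in zip(z, points)),
           -sum(zi * y for zi, (x, y) in zip(z, points)),
           -sum(z)]
    A, B, C = solve3(M, rhs)
    cx, cy = -A / 2.0, -B / 2.0
    return cx, cy, math.sqrt(cx * cx + cy * cy - C)

# Points on a circle of radius 0.8860 centered at (1, 2):
pts = [(1 + 0.8860 * math.cos(t), 2 + 0.8860 * math.sin(t))
       for t in (i * 2 * math.pi / 360 for i in range(360))]
cx, cy, r = fit_circle(pts)
print(round(2 * r, 4))  # fitted diameter
```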
Initially, I was consistently getting one to one results between the cmm measurements and a micrometer.
Recently however, I'm finding that the cmm is consistently reporting .0001" higher than what I measure using two different micrometers.
I know .0001" is extremely small and I'm trusting that the micrometers are giving me the more accurate results.
Am I doing something wrong?
Is it reasonable to assume that I can get the CMM and a micrometer to match to one ten-thousandth of an inch?
There have been a couple of crashes (not huge crashes); could this have thrown the probe off by .0001"?
I just calibrated using a scan by checking the calibrate scan RDV option, and I think I'm seeing better results now.
Out of the five circles measured, the roundness ranges from .0001" to .0004".
I was trying to do what you said with the feature set scan but it keeps giving me an expression error.
What I did is measure the distance from the first point to the last point, 180 degrees apart. The CMM and the mic were pretty much dead on, with a difference of only .00005".
Per the F1 info for the LSPX probe, I recall the 5x50 being desired (but not required) for probe Rack Calibration. But Lower matrix calibration is to be a 5x20 with a probe named "LSPX1H_CAL_PROBECHANGER". Got too much on my plate today to dig up & quote the actual F1 help info. I set my master to a 5x20 (named as "LSPX1H_CAL_PROBECHANGER") and calibrate all racks with it as well, with zero issues.
Why not report to 5 decimal places? The probe sensor is capable of it.
Flip the part over (if you can) and reassess. Make sure the data follows the part. This will help lock down if it's truly the part or the machine causing the delta.
A least-squares cylinder (or circle) on a part with 0.0004" roundness can absolutely be the cause of your 0.0001" diameter discrepancy.
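To put a number on that, a 2-lobed (oval) part shows it clearly: a least-squares circle reports roughly the mean diameter, while a micrometer reads a two-point diameter that swings by the full out-of-roundness depending on where you land. A rough simulation (hypothetical lobe amplitude; mean radius used as a stand-in for the least-squares radius, which it approximates for uniformly spaced points):

```python
import math

R = 0.8860      # nominal radius, inches
A2 = 0.0001     # assumed 2-lobe amplitude -> 0.0002" radial peak-to-valley

def radius(theta):
    """Radial profile of a hypothetical 2-lobed (oval) part."""
    return R + A2 * math.cos(2 * theta)

thetas = [i * 2 * math.pi / 720 for i in range(720)]

# Least-squares diameter ~ mean diameter for uniformly spaced points
lsq_dia = 2 * sum(radius(t) for t in thetas) / len(thetas)

# Two-point (micrometer-style) diameters around the part
two_pt = [radius(t) + radius(t + math.pi) for t in thetas]

print(round(lsq_dia, 5))      # mean (least-squares-like) diameter
print(round(max(two_pt), 5))  # mic across the high lobes
print(round(min(two_pt), 5))  # mic across the low lobes
```

With only 0.0002" of radial lobing, the two-point reading spreads 0.0004" around the least-squares value, so a repeatable 0.0001" CMM-vs-mic offset is easily explained by form error plus where the mic anvils happen to land.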
Another good sanity check is to view the hit deviations, rotate the part 90°, and confirm the pattern of errors rotates with the part. If not, you've got FOD on your probe tip or bias in your probe's calibrated state.
Finished calibrating the lower matrix. The matrix looked good from what I read: no whole numbers, and less than .15 difference between the top-left and middle-middle values.
Still reporting .0001" higher though.
I took off the scanning probe and reseated it. It was then reporting 3 tenths off. So I redid the lower matrix, recalibrated everything, and boom, it looks like the measurements are lining up with the micrometer again. I don't want to speak too soon though; we'll see what it decides to do next week. Maybe the last crash somehow caused the probe to slightly unseat? Is that even possible? I don't know, but thanks for your help.