Accuracy

Does the 4-to-1 rule apply to portable CMM measuring accuracy, and how do I best explain this to my superiors? The arm we use states a length accuracy of .016 microns, or .00062 inches, so do I multiply this by 4 = .0025 and tell them that the expected accuracy of the arm would be this? Hence, trying to verify position tolerances of .05 micron might be a stretch of the arm's capability, even though I can repeat good measurements on a 20" Webber gage within .0005 every time, no matter where I place it in the reach of the arm, and this includes the Z axis.

I said that I could state without a doubt that we could verify position tolerances within .2 microns, or .008", and I would not want them to lose work over my statement if I am not correct.
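
A quick way to frame the 4-to-1 arithmetic for your superiors is to run it directly. Here is a minimal sketch in Python, assuming the spec and the tolerances are both expressed in inches; the constant and function names are illustrative, not from any vendor software:

    # 4:1 rule of thumb: the tolerance being verified should be at
    # least 4x the stated accuracy of the measuring device.
    ARM_ACCURACY_IN = 0.00062  # stated length accuracy, inches

    def min_verifiable_tolerance(accuracy_in, ratio=4.0):
        """Smallest tolerance the device can credibly verify under an N:1 rule."""
        return accuracy_in * ratio

    def meets_rule(tolerance_in, accuracy_in, ratio=4.0):
        """True if the tolerance is wide enough for this device."""
        return tolerance_in / accuracy_in >= ratio

    print(min_verifiable_tolerance(ARM_ACCURACY_IN))  # 0.00248, i.e. ~ .0025"
    print(meets_rule(0.008, ARM_ACCURACY_IN))         # True  (about 12.9:1)
    print(meets_rule(0.002, ARM_ACCURACY_IN))         # False (about 3.2:1)

So multiplying the stated accuracy by 4 does give the tightest tolerance the arm should be asked to verify; anything tighter fails the rule.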
  • Good luck with that. I wouldn't use a portable arm on anything with tolerances less than +/- a millimeter. The one I've been using isn't anywhere near as accurate as stated. In a perfect environment, with a perfect operator measuring perfect parts, you might have a slim chance of holding tolerances less than +/- .5 millimeters. Being as careful as I can, measuring the same part multiple times, I have a very difficult time repeating within .25 mm. And just so you know, .2 (point two) microns is not .008"; it's .0000079". If you actually meant two microns, then it's .0000787". Eight thousandths is .203 millimeters (203 microns), which is about as good as it gets with a portable arm, but this is seriously pushing the limits when measuring real parts. Based on my experience I wouldn't even think about making the statement you've made above to anyone.

    A Webber bar is NOT a real part. Go measure a few of your real parts and check your own repeatability. Then tell your supervisors what the arm measures in the REAL world.

    This is based on my own experience, and is my best opinion.

    I just reread your post. Substitute the word millimeters everywhere you said microns and your numbers are correct; the quick conversion sketch below bears this out.

    Bill
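
    Bill's conversions above are easy to double-check. A minimal sketch, assuming only the standard definition of 25.4 mm (25,400 microns) per inch; the function names are just for illustration:

        # Unit check for the micron/millimeter/inch mix-up above.
        UM_PER_INCH = 25400.0  # 1 inch = 25.4 mm = 25,400 microns

        def microns_to_inches(um):
            return um / UM_PER_INCH

        def inches_to_microns(inches):
            return inches * UM_PER_INCH

        print(microns_to_inches(0.2))    # ~0.0000079"  (0.2 micron)
        print(microns_to_inches(2.0))    # ~0.0000787"  (2 microns)
        print(inches_to_microns(0.008))  # 203.2 microns, i.e. 0.203 mm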
  • Agree with reservations...

    Hi Careful -

    You have an ironic name for this thread...

    I agree with all that Bill says about the millimeter/micron thing, which makes me wonder how you have 8 years on an arm but would make such a faux pas in front of this audience.

    Regarding Romer accuracy (you haven't ever checked this out during eight years? sorry), I also agree with Bill, but not to his magnitude of pessimism. I think that, with a skilled manual CMM guy at the helm, good (not perfect) environmental conditions, and a rigid, direct-connect fixturing method between arm and component, very good accuracy and repeatability is possible with a Romer, at least one of recent vintage. I can't speak for the older 3000 series or even earlier, but I know the encoders are a lot better now than they used to be. I own a 2008 Infinite 2.0 3-meter machine and have been very happy in general, but I also better understand what Bill has learned... sometimes, admittedly, to my own consternation.

    You gotta be Careful. Very careful. Sheet metal, no bigger than 2/3 of the arm's range: +/- .015 all day. Machined parts: +/- .005 is realistic under the conditions described above. With the smaller parts you might approach position within .005, but you have to have the trifecta, presuming also that your arm hasn't been banged around for eight years. I also run a client's arm that has been through the wringer, same year as mine, and I have significantly less confidence in that tool. Lucky for me, I am the only one to have ever held my Romer, and all my equipment is pristine; that is another part of the formula for Romer accuracy. If you have your arm C-clamped to a plate, or some other missing-link type of fixturing problem, forget it.

    Scott
  • Sorry, so as not to confuse the unit-of-measure issue in this thread any further: my numbers in the last paragraph above are in inches.

    Scott
  • Bill, I'm interested in what the spec of your arm is. If it is as bad as you say and still in spec, that's the way it goes. If it's measuring out of spec, why would you expect it to be accurate at all?

    I know we have had this argument before, but it is exactly the reason we have standards such as B89.4.22: so that customers are protected.
  • The calibration sticker states that it's repeatable to .016. I had been using it in a rough "shop floor" environment: on the rolling base with lock-down feet, measuring weldments in a weld fixture/jig. This is the condition and environment that the salesman sold it for. Granted, this is a rough environment, but it does NOT measure to the spec it was sold to. Yeah, I know, I know better, I've been around a long time, but the fact is that I wouldn't trust a Romer to hold anything tighter than +/- .25 mill. Your environment and your parts may allow you to measure a lot closer and more repeatably than this, and if they do, good for you. The one I finally put in its case and hid in the storage room about a week ago doesn't.

    Bill
  • "I wouldn't trust a Romer to hold anything tighter than +/- .25 mill." (Bill)


    So Bill, just to be sure about your numbers... you trust a Romer to +/- .00025"?
  • +/- 0.25mm = +/- 0.0098"


    Let's just call it .010" between friends.

    Bill
  • "Let's just call it .010" between friends." (Bill)


    Point repeatability of 16 microns... that sounds like you have a 1.8-meter Infinite, true?

    If so, that spec is half the maximum range on any of the 3 axes, so if you turn probe comp off in PC-DMIS and articulate in a seat while taking points, no value on any axis should deviate by more than 32 microns. Of course, this assumes decent conditions and a good setup; if you don't have a decent environment, or are unable to set the arm up appropriately, then please understand that that is why your numbers are so bad.

    If you cannot get your arm to measure in spec under good conditions, please send it in for calibration rather than telling the whole world that these arms are incapable of measuring better than 250 microns.
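
    To make that seat test concrete, here is a minimal sketch of how the logged points might be reduced, assuming coordinates exported in millimeters; the sample values are made up for illustration, so substitute your own data:

        # Single point articulation test: probe parked in a kinematic
        # seat, probe comp off, arm articulated while logging XYZ.
        points_mm = [
            (100.002, 50.001, 25.004),
            (100.010, 49.998, 25.001),
            (99.995, 50.006, 24.990),
        ]

        SPEC_UM = 16.0              # single point repeatability spec
        MAX_RANGE_UM = 2 * SPEC_UM  # 32 microns max spread per axis

        for axis, label in enumerate("XYZ"):
            values = [p[axis] for p in points_mm]
            spread_um = (max(values) - min(values)) * 1000.0  # mm -> microns
            verdict = "OK" if spread_um <= MAX_RANGE_UM else "OUT OF SPEC"
            print(f"{label}: range {spread_um:.1f} um -> {verdict}")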
  • +/- 0.25mm = +/- 0.0098"


    It was mill, not mm ... big difference. Just saying.
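
    For anyone following along, the two readings of "+/- .25 mill" differ by a factor of about 40, since a mill (machinist slang for a thousandth of an inch) is .001" while a millimeter is about .0394". A quick check, in plain Python with nothing vendor-specific:

        MM_PER_INCH = 25.4

        quarter_mill_in = 0.25 * 0.001      # "mill" = .001", so 0.00025"
        quarter_mm_in = 0.25 / MM_PER_INCH  # 0.25 mm is about 0.0098"

        print(quarter_mill_in, quarter_mm_in)  # 0.00025 vs ~0.00984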