Roughly determining ID thread clocking to locate the thread start, so I have the maximum surface area of the lead-in chamfer available for a diameter measurement. Basically, I'm measuring a countersink that spans at most 1/3 of the circumference of an internal thread entrance, and I need to know the thread clocking to even make an attempt.
I understand this is one of those "could you, but should you?" questions, but please humor me; our customer has requested that we try. The threads are small, 5/16 in and 1/4 in, and the countersink diameter doesn't even exceed the major diameter. I think at minimum one might be able to measure the countersink angle and the radius from the countersink edge to the center of the thread minor with two lines and an intersection point. It doesn't help matters that nobody actually models threads this small.
Imagine that the ONLY thing the program needed to do was this ONE task: measure the clocking of a small thread.
My guess is that step one would be to locate the minor diameter of the thread and translate my alignment to the intersection of the thread and the surrounding surface. Then come down some distance and use a self-centering 2D vector point to nest my stylus ball between two adjacent thread crests at the minor diameter. That gives me a depth measurement from that point to the surface. I would think one could divide that distance by the pitch of the thread to get an integer number of turns plus a fractional remainder. If my point were directly below the thread start, the division would come out to a clean integer with no remainder. If there is a remainder, then in theory one could relate that fraction of a turn to the 360° circumference and estimate the thread clocking.
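For the divide-by-pitch step, here's the arithmetic as a Python sketch. The pitch and depth values are invented for a hypothetical 1/4-20, and the direction of the angle depends on thread handedness and which way you call positive clocking, so treat that as an assumption.

```python
import math

# Hypothetical values for a 1/4-20 UNC thread (assumption: right-hand thread)
pitch = 1.0 / 20.0     # inches per revolution
depth = 0.1375         # measured axial distance from surface to nested stylus point, in

turns = depth / pitch                    # e.g. 2.75 turns
remainder = turns - math.floor(turns)    # fractional turn, 0.0 .. 1.0

# A fraction of a turn maps directly onto a fraction of 360 degrees.
# remainder == 0 means the nested point sits directly below the thread start.
clocking_deg = remainder * 360.0         # angular offset of the thread start
                                         # from the probed point, about the thread axis

print(f"turns = {turns:.3f}, clocking offset = {clocking_deg:.1f} deg")
```

With these made-up numbers that's 2.75 turns, so the thread start would sit roughly 270° around from the probed point.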
We aren't going to use thread gauges, pins, or tru-pos. The only tools we have at our disposal are the thread, the surface, and the stylus.
Am I on the right track? Has anyone tried this before? Are there any productive suggestions? Stylus diameter? Self-centering points? Scanning?
I was so slammed this weekend I didn't get a chance to investigate our hardware configuration or work on this problem at all. Three consecutive 14-hour days.
louisd, I think we have X1H probe bodies; I don't know what controller we have; we have CAD++. One interesting clue might be that our stylus calibrations error out whenever we try unchecking the TRAX calibration box.
Paganini, that sounds like a great idea. It also sounds familiar; see my initial post. Thank you though. The line-and-point matrix idea might get me out of my bind with the self-centering points crashing my software.
Benedictj1, lol, our CMMs sh!t the bed HARD scanning chamfers. The probe spirals into the bore as it scans, slipping down the chamfer at about 0.015" per inch of travel. I've had better luck scanning a line across the surface and letting it fall into the bore, then using LINESEGMENTEND to grab the last point before the chamfer. I have found this isn't very accurate at all, but I'll admit I haven't understood it fully.
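For what it's worth, outside of PC-DMIS the same "last point before the edge" idea can be sketched like this in Python. It assumes the scan has been exported as ordered XYZ hits and that the pass really does fall off the face; the tolerance value and function name are made up.

```python
import numpy as np

def last_point_before_drop(scan_points, drop_tol=0.0005):
    """
    Walk a scanned line (ordered x, y, z hits) and return the last point
    whose z still sits on the top surface, i.e. before the scan starts
    dropping down the chamfer / into the bore.

    drop_tol: how far below the surface a hit may sit before we call it
    "off the edge" (same units as the scan, assumed inches).
    """
    pts = np.asarray(scan_points)
    surface_z = np.median(pts[:3, 2])          # assume the first few hits are on the flat
    on_surface = pts[:, 2] > surface_z - drop_tol
    last_idx = np.argmax(~on_surface) - 1      # first hit off the edge, minus one
    return pts[last_idx]

# Hypothetical exported scan: a straight pass across the face that falls into the bore
scan = [
    (0.000, 0.0,  0.0000),
    (0.020, 0.0,  0.0001),
    (0.040, 0.0, -0.0001),
    (0.060, 0.0, -0.0002),
    (0.080, 0.0, -0.0150),   # stylus has started down the chamfer
    (0.100, 0.0, -0.0450),
]

edge_point = last_point_before_drop(scan)
print("last surface point before the chamfer:", edge_point)
```

That's only as good as the point density near the edge, which probably explains why the endpoint approach hasn't felt very accurate.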