
UPDATE command

I have known about the UPDATE command and have even posted it as a possible solution for other users, but I have never used it myself.

So, my question is, have any of you ever had any issues with using the UPDATE command?

I know some of you have had issues with databases getting messed up and then never being able to get any of the data back out of them. Do those of you with that problem use UPDATE?

The reason I am asking is that I am going to have to run a rather large study soon (like, 300 or more!) and I want to automate the process as much as I can. Looping, UPDATE, that kind of thing, with a simple COMMENT to pause the loop so I just press the DONE button on the remote to start the next cycle. But, for obvious reasons, I do not want to risk losing any of the data and having to start over (THAT WOULD SUCK!!!!!!).

So, are there ANY drawbacks to using the UPDATE command? What is gonna bite me in the rump on this?
  • The UPDATE command calls DPUPDATE.EXE to update the database. There is a somewhat undocumented feature that if a file named "STATUTIL.BAT" exists in the pcdmis folder, DPUPDATE will create a copy of the XSTATS11.TMP file, move it to a subdirectory called "STAT", and then process the original file. The STATUTIL.BAT file does not need to do anything - just exist. It could be used to copy the stats file to another location to save it.
    The saved name would be in the format:
    yyMddHttt.TMP, where:
    yy - two-digit year
    M - month, where A = Jan, B = Feb, ...
    dd - day of month
    H - hour, where A = 0, B = 1, ... (24-hour format)
    ttt - minute & second
    So the data could be sent to the DB again if needed.

    Examples of STATUTIL.BAT:
    REM don't do anything - just can't be an empty file

    or

    @echo off
    REM copy duplicate stats file to a 'safe' location
    copy %1 c:\temp\.
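A quick way to sanity-check a saved file is to decode its name. This is a small sketch (Python, not part of DPUPDATE) based only on the field coding described above; the two-digit year is assumed to mean 2000+, and the exact encoding of the "ttt" minute-and-second field is not spelled out, so it is returned raw.

```python
# Hypothetical decoder for the yyMddHttt.TMP naming scheme described above.
def decode_stats_name(name):
    stem = name.upper().removesuffix(".TMP")
    year = 2000 + int(stem[0:2])           # yy  - two-digit year (2000+ assumed)
    month = ord(stem[2]) - ord("A") + 1    # M   - A = Jan, B = Feb, ...
    day = int(stem[3:5])                   # dd  - day of month
    hour = ord(stem[5]) - ord("A")         # H   - A = 0, B = 1, ... (24-hour)
    ttt = stem[6:9]                        # ttt - minute & second, encoding unspecified
    return year, month, day, hour, ttt

# "09C15N123.TMP" -> year 2009, month 3 (March), day 15, hour 13
print(decode_stats_name("09C15N123.TMP"))
```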
  • I used to loop and update stats automatically and never had problems. However, on machines here that I update automatically without looping I have had numerous problems, mostly corrupt databases. I can't say they are related, and I don't believe they should be, but that is the one common element. Another difference: before, when I did not have issues, it was small loops of 30-50 pieces with few dimensions (10-20). Now, when I have problems, it is a situation where I have thousands of transactions in a database with 50 dimensions or more. I'd be inclined to believe that is more the culprit. Still, I cannot say what causes my recent woes. I used to have no issues with Datapage, but nowadays it frustrates me to the point that I gave up on it.

    I send all of my data comma-delimited to a text file for importing, when I need to, into an application other than Datapage (such as Excel). No issues.

    If you are not having problems with Datapage now, I am inclined to believe that automatically updating stats will not give you any problems. My hunch is that is not what is giving me my problems here.
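For what it's worth, the comma-delimited-text approach described above takes very little code. A minimal sketch; the part and dimension names and the four-column layout are made up for illustration, not the actual PC-DMIS output format:

```python
import csv

# Hypothetical dimension results; names and fields are illustrative only.
rows = [
    ("PART001", "DIA_1", 12.700, 12.703),
    ("PART001", "LOC_2", 0.000, 0.012),
]

# Write a comma-delimited text file that Excel (or anything else) can open.
with open("stats.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["part", "dimension", "nominal", "measured"])
    w.writerows(rows)
```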
  • Good info, boogerboy, but I do have a question for you (well, 2 anyway):
    1) Do you use separate databases for each part or group of parts, or are they all in one database? (I don't)
    2) Do you periodically clean out your database(s)?

    The reason for question #2 is that even when you delete things (parts, data, etc.) from the Datapage database, they are NOT deleted, just 'hidden' in the database where you cannot see them anymore. Every 4 to 6 months I clean out the database: I archive off old jobs that have been sold off and are no longer here, delete those parts from the database, then do an ASCII dump of all the remaining data (all at once; works fine). I then make a copy of all the current database files (placing them in a sub-dir), copy an empty set of files to the database directory to give me a fresh start, ASCII load the still-current data, and I am good to go. If you don't do this, sooner or later (sooner!) your database will grow to a size that rivals your virtual-memory file. Being construction, not production, parts/jobs are typically here for less than a year, so I don't need to keep huge amounts of data in my database.
  • I have separate databases for families of parts or customers. Each database typically contains several part numbers (CMM programs). As for number 2: when a database starts getting big, I move it to a backup location and create a new database. I only do an ASCII dump when I have to send data to someone for analysis who does not have Datapage.

    I think you'll be fine. I am inclined to believe (though I can't prove it) that my issues are related to the size/number of transactions. When I started archiving large databases and starting fresh, I ran into fewer issues. But then it became more work and a pain in the butt to merge data.

    These days I am fine with text files. They are shareable, which is nice, as I am finding fewer and fewer people who have or even want Datapage. The ASCII dump from Datapage is nice for that too, though.