On 2016-11-03 17:31, Charlie wrote:
On 11/3/2016 2:24 PM, Gene Wirchenko wrote: [snip]
I'm handling one of Jeff Johnson's (RIP) clients and having this issue. I was about to create a simple utility for the end-user to grab the table and pack it, but I'm curious if you've got a better idea?
One thing I put into my design is working with temporary cursors or objects (e.g. built via SCATTER MEMO NAME oDataRec....). I rarely have user-interface objects directly accessing DBFs.
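A minimal sketch of that buffering approach, assuming a table and memo field named here for illustration (customer, notes):

```foxpro
* Edit through a scattered object instead of binding the UI to the DBF.
* Table/field names are assumptions for the example.
USE customer IN 0 SHARED
SELECT customer
SCATTER MEMO NAME oDataRec    && copies the record, memo fields included, to an object
oDataRec.notes = oDataRec.notes + CHR(13) + "appended text"
GATHER MEMO NAME oDataRec     && writes the object's properties back to the record
```

The UI binds to oDataRec's properties; the DBF (and .fpt) is only touched on the final GATHER.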
As I recall, the main cause of memo bloat was saving data. Specifically, if I edited a memo and then saved it, and the new contents were larger than the blocksize, VFP would write a whole new block in the .fpt file; the original block became wasted space at that point. I think there are some nuances to this: e.g. VFP doesn't see the memo as "bigger" unless the blocksize threshold is crossed - or maybe the text has to be bigger than the text as first edited. I can't remember.
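A minimal sketch of reclaiming that wasted space, assuming you can get exclusive use of the table (name is illustrative):

```foxpro
* PACK MEMO rebuilds the .fpt and discards orphaned blocks.
* Requires exclusive access, so run it when no users are in the table.
USE mytable EXCLUSIVE
PACK MEMO          && compacts only the memo file, leaves records alone
USE
```

You can also set SET BLOCKSIZE before creating a table to tune how much a memo can grow before a new block is allocated, but that only affects tables created afterward.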
Just an update: I found 2 records that had bazillions of lines in them (determined via MEMLINES()), and a process had been adding another line every 6 minutes since August 2016. No wonder the file size spiraled. The vendor has fixed the source of the problem. All good now.
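For anyone hunting the same thing, a sketch of the scan I mean, with the table, memo field, and threshold all assumed for illustration:

```foxpro
* Flag records whose memo has an abnormal number of lines.
* Note MEMLINES() counts lines relative to SET MEMOWIDTH.
USE mytable IN 0 SHARED
SELECT mytable
SCAN
   IF MEMLINES(notes) > 10000          && threshold is a guess; tune to your data
      ? RECNO(), MEMLINES(notes)       && report record number and line count
   ENDIF
ENDSCAN
USE
```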