I have just been bitten by SET DECIMALS. (Is there really any use for this besides making programming just a bit more difficult?)
There may have been, when it was invented in the dBASE days. I believe it does not affect the stored values, only how they are displayed.
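For example, a quick command-window sketch (the outputs in the comments are what I'd expect, per the docs for SET DECIMALS):
<vfp>
SET DECIMALS TO 2
? 1/3        && displays 0.33
SET DECIMALS TO 6
? 1/3        && displays 0.333333 -- the value itself was never rounded
</vfp>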
I have a table with values to four decimal places. I use <vfp> transform(thevalue,"999999.9999") </vfp> to create the string representation. With SET DECIMALS at its default value of 2, a value of 0.0123 is converted to " 0.0100", which loses two digits of precision.
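For the record, a minimal reproduction (assuming the default setting of 2):
<vfp>
SET DECIMALS TO 2
? TRANSFORM(0.0123, "999999.9999")   && 0.0100 -- rounded at two places before formatting
SET DECIMALS TO 4
? TRANSFORM(0.0123, "999999.9999")   && 0.0123 -- all four digits kept
</vfp>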
What if you multiplied the number by 10^4? Would the precision still be there, only not displayed?
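That's easy to check; a sketch of the test I'd run:
<vfp>
SET DECIMALS TO 2
? TRANSFORM(0.0123 * 10^4, "999999")   && 123 -- the digits survive as a scaled integer
</vfp>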
I had this problem with another data item that had more than two decimal places. I wrote a special function to handle it: it sets SET DECIMALS to the number of decimal places I need, does the transform(), and then sets SET DECIMALS back to the default.
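Roughly like this sketch (the function name and parameters are my own; the original code was not posted):
<vfp>
FUNCTION NumToStr
LPARAMETERS tnValue, tnPlaces
LOCAL lnOldDecimals, lcResult
lnOldDecimals = SET("DECIMALS")        && remember the caller's setting
SET DECIMALS TO (tnPlaces)             && widen it so TRANSFORM() keeps every digit
lcResult = TRANSFORM(tnValue, "999999." + REPLICATE("9", tnPlaces))
SET DECIMALS TO (lnOldDecimals)        && put it back
RETURN lcResult
ENDFUNC
</vfp>
Saving and restoring via SET("DECIMALS") is a little safer than assuming the default of 2, in case some other routine has changed the setting.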
Why did I do that?
That is probably the crux of the matter.
BUT also because I really do not understand the point of SET DECIMALS.
Is there someplace you could look it up? A reference guide of some sort?