Tue 11 Mar 2014 04:34:14 PM UTC, comment #9:
> So, this bug can be closed invalid, but I can't seem to be allowed to do so myself.
I think that's because it was filed as "anonymous" (I guess that you were not logged in when you filed it). I'll close it.
Thanks for checking with the maintainer of the file.
|
Tue 11 Mar 2014 04:11:18 PM UTC, comment #8:
Yes, you are right on all counts.
I visited the upstream maintainer of the offending files
today, and indeed the variables carry an overly long string
of boolean values, so the variables are simply misdefined.
Fortunately the upstream project is still active, so fixed
versions are in preparation.
So, this bug can be closed invalid, but I can't seem to be allowed to do so myself.
Thanks for the explanation.
|
Mon 10 Mar 2014 08:20:39 PM UTC, comment #7:
> I thought SPSS allowed for more or less arbitrary precision,
> and expected pspp to mimic this.
I do not think that SPSS does this. Even if it did, the additional precision past 64 bits would be lost whenever you write data to a .sav file, because the .sav file format uses 64-bit floating-point numbers.
PSPP uses 64-bit floats internally.
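For illustration only (plain Python, nothing from PSPP's source), here is what a single 64-bit double can and cannot hold:

    import struct

    # A numeric value in a .sav file occupies one IEEE 754 double, i.e. 8 bytes.
    x = float(100000000000000032)            # the example value from this report
    print(len(struct.pack("<d", x)))         # 8
    # Integers are only guaranteed exact up to 2**53; beyond that, neighbouring
    # integers start to collapse onto the same double.
    print(2**53)                             # 9007199254740992
    print(float(2**53) == float(2**53 + 1))  # True: the distinction is already gone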
|
Mon 10 Mar 2014 04:58:20 PM UTC, comment #6:
OK, I see the limits to the precision when this is handled as a "real" float.
I thought SPSS allowed for more or less arbitrary precision,
and expected pspp to mimic this.
Using bc, or even plain expr, numbers of far greater precision are handled properly.
(Yes, I know arbitrary-precision arithmetic is a really expensive thing to do,
so it may just not be practical.)
Granted, if the field definition is "sane" and the number really is meant to represent a float, 64-bit precision is enough and
all is well. It is the sanity of the variable definition that I doubt.
I will go and find out what really is behind these
numbers. The largest of these are something * 10^28...
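(To illustrate what I mean, a quick check in Python rather than bc, with a made-up value of that magnitude:)

    # Arbitrary-precision integer arithmetic (as in bc) keeps every digit;
    # a 64-bit double cannot once the value is far beyond 2**53.
    n = 10**28 + 7              # made-up value, roughly the magnitude in the data
    print(n)                    # 10000000000000000000000000007, exact
    print(float(n))             # 1e+28: the trailing 7 is lost in the 64-bit form
    print(int(float(n)) == n)   # False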
In the meantime, thanks for looking into this.
|
Mon 10 Mar 2014 04:22:29 PM UTC, comment #5:
> An example number is
> 100000000000000032,
> exported by pspp to
> 1.0000000000000003e+17
PSPP is exporting this number with as many digits as required to fully express the internal precision. All of the numbers from 100000000000000030 to 100000000000000039 (and perhaps in a wider range) have exactly the same representation in the 64-bit form used in the computer. Adding a '2' at the end does not indicate an additional digit of precision; on the contrary, it is deceptive in that it claims some precision that is not there.
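If you want to verify this, here is a quick illustration in Python (just to demonstrate the arithmetic; it is not what PSPP runs):

    # Every integer from 100000000000000030 through 100000000000000039 converts
    # to the same 64-bit double, and the form PSPP prints parses back to exactly
    # that double.
    for n in range(100000000000000030, 100000000000000040):
        print(n, float(n))      # each line shows ... 1.0000000000000003e+17
    print(float("1.0000000000000003e+17") == 100000000000000032)   # True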
|
Mon 10 Mar 2014 04:02:29 PM UTC, comment #4:
Yes, sure.
Sorry for not posting all the details in the initial report;
I thought the problem was more generic than it actually is.
An example number is
100000000000000032,
exported by pspp to
1.0000000000000003e+17
The file can be found here:
https://dbk.gesis.org/dbksearch/download.asp?db=E&id=34868
(Sorry, I can't attach it here due to a licensing restriction.)
The offending number lives in variable q13b13, row 7499.
Looking at the data with psppire, I find the number represented
correctly.
|
Sun 09 Mar 2014 05:44:06 PM UTC, comment #3:
Can you give an example of a number for which precision is lost? Looking at the code that generates CSV and tab-delimited output, it should not lose any precision.
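(The general principle, illustrated with Python rather than PSPP's actual C source: a 64-bit double printed with enough significant digits parses back to the identical value, so a text export need not lose anything.)

    # Not PSPP's code, just the principle: 17 significant digits always suffice
    # for a 64-bit double to survive a write-and-reparse round trip.
    x = 1.0 / 3.0               # an arbitrary value that is not exact in decimal
    s = "%.17g" % x             # '0.33333333333333331'
    print(s)
    print(float(s) == x)        # True: no precision is lost in the text form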
|
Tue 04 Mar 2014 08:34:34 AM UTC, comment #2:
Yes, sure:
I am looking at the CSV export generated with:
GET FILE="source.sav".
SAVE TRANSLATE
/TYPE=TAB
/FIELDNAMES
/OUTFILE="pspp_out.csv".
|
Fri 28 Feb 2014 06:55:05 PM UTC, comment #1:
Can you give an example? I understand that you have large numbers in your dataset, but I don't know which way you are using PSPP to view them.
|
Fri 28 Feb 2014 05:27:12 PM UTC, original submission:
I found a few datasets (ab)using very large numbers, on the order of 1e+15 to 1e+19. Actually, they seem to contain codes that
should be strings in the first place, but oh well. SPSS handles
them properly; the width is given as, e.g., 22.
While SPSS copes with these numbers, in PSPP I find them represented in e-notation, but not with full precision. So I lose
information and introduce errors (when the e-notation is expanded,
the trailing digits' information is gone).
Some cursory searches got me nowhere, so apologies if this is a known issue. I tested on a recent nightly build from February 2014.
I think this should be easy to reproduce, so I do not give links to the offending datasets -- their license does not allow attaching them. If you think otherwise, please ping me.
Thank you!
|