Number – Talk

From Bohemia Interactive Community

I deleted the reference to integer, because there is no such type in OFP scripting.

--Suma 11:36, 30 June 2006 (CEST)


Is anyone sure of the precision to which numbers can be stored in variables?

OFP example:

var1 = 123456789
var1 = var1 - 123000000
var1 will return 456792

As a non-programmer, I find this behaviour quite strange. --Ceeeb 08:03, 15 February 2007 (CET)
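
A possible explanation (an assumption based on the observed result, not something confirmed by engine documentation): scripting numbers behave like 32-bit IEEE 754 floats, which near 123 million are spaced 8 apart, so 123456789 gets rounded to the nearest representable value, 123456792, before the subtraction. A minimal sketch in SQF syntax:

  // assumption: numbers are stored as 32-bit floats, so 123456789
  // is rounded to the nearest representable value, 123456792
  _var1 = 123456789;
  _var1 = _var1 - 123000000;  // 123456792 - 123000000
  hint format ["%1", _var1];  // expected to show 456792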


A test script:
player globalchat format["Precision Test: %1, %2, %3, %4, %5, %6, %7, %8, %9", 1234567 - 1234566, 1234567, 999998+1, 999998+2, 999998+3, (9999999+6)-(10999999), (10000000)-(10999999), 9999999/2 - 4999998, 9999999/2 - 4999998.4];

Which gives:
PRECISION TEST: 1, 1.23457E+006, 999999, 1E+006, 1E+006, -999994, -999999, 1.5, 1

This seems to suggest that you can only store a maximum value of 999999 (6 digits) in a variable.
However, you can use up to 8 digits in integer equations, provided the result is 6 digits or less.
Please do not take the above two statements as fact, since I have only done a few hours of limited testing.

--hellop 09:30, 11 July 2008 (PST)
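
A note on the figures above (an assumption drawn from the output, not verified against the engine): the 6-significant-digit output such as 1.23457E+006 looks like a limit of how format displays numbers rather than of how they are stored, since 1234567 - 1234566 still yields exactly 1. A small sketch in SQF syntax:

  _big = 1234567;
  player globalChat format ["%1", _big];           // displayed as 1.23457E+006 (rounded for display only)
  player globalChat format ["%1", _big - 1234560]; // displays 7, so the stored value was exact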

Precision loss

There are problems with adding big numbers: for example, (16 777 216 + 1) - 16 777 216 returns 0 instead of 1, so precision is lost. The results below show, for each step size, the first number from which it is no longer possible to add that step without losing precision.

_value = 0;
_array = [100000, 10000, 1000, 100, 10, 1];
for "_i" from 1 to 1000000000 step 1000000 do {
    {
        _value = (_i + _x) - _i;
        if (_value != _x) exitWith {
            for "_k" from (_i - 1000000) to _i step _x do {
                _value = (_k + _x) - _k;
                if (_value != _x) exitWith {
                    diag_log format ["precision loss from %1 with adding %2", [_k] call BIS_fnc_numberText, _x];
                };
            };
            _array = _array - [_x];
        };
    } forEach _array;
};

Result:

  1. precision loss from 16 777 216 with adding 1
  2. precision loss from 33 554 440 with adding 10
  3. precision loss from 67 108 800 with adding 100
  4. precision loss from 134 218 000 with adding 1000
  5. precision loss from 268 440 000 with adding 10000
  6. precision loss from 536 800 000 with adding 100000

--Will (talk) 13:56, 28 May 2016 (CEST)
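
For context (an assumption based on the pattern of the results, not confirmed by the engine): 16 777 216 is 2^24, and a 32-bit IEEE 754 float carries a 24-bit significand, so integers above 2^24 can no longer all be represented exactly; each further doubling of the magnitude doubles the gap between representable values, which roughly matches where each larger step size begins to fail in the list above. A minimal check in SQF syntax:

  // 2^24 + 1 has no exact 32-bit float representation, so the +1 is lost
  hint format ["%1", (16777216 + 1) - 16777216]; // expected 0
  hint format ["%1", (16777215 + 1) - 16777215]; // expected 1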