R.Wieser
2015-07-11 12:58:04 UTC
Hello All,
I'm reading values out of a file, and ran into trouble because the
decimal-point notation was different from that of my local settings (Europe
uses "," where the USA uses ".").
After some Google-digging I found the SetLocale() command, and it works
well-ish.
The problem is that it only seems to affect explicit conversion commands
(like CDbl()), but *not* any automatic conversion (which is a "disaster"
waiting to happen). :-|
Example:
"3.4" + 1 => 35 (the dot is simply "forgotten")
CDbl("3.4") + 1 => 4,4
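A minimal sketch of what I'm doing (assuming the machine's own locale uses "," as the decimal separator, and that "en-us" is an accepted locale name here):

```vbscript
' Ask the script engine to parse "." as the decimal separator.
SetLocale "en-us"

' Explicit conversion honours SetLocale: "3.4" is parsed as 3.4,
' and the result is displayed per the output locale.
WScript.Echo CDbl("3.4") + 1    ' => 4,4

' Implicit conversion seems to ignore it: under the "," locale the
' dot is simply dropped, "3.4" becomes 34, and 34 + 1 = 35.
WScript.Echo "3.4" + 1          ' => 35
```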
Question:
How do I get the automatic conversions to honour the localisation setting
too?
Further info:
I've embedded a scripting object into a program of mine, so I've got quite a
bit of access to all its objects. I've already been playing with the
GetLCID callback (returning different LCIDs from it), but can't seem to
provoke any change in behaviour.
Regards,
Rudy Wieser