11-15-2017, 07:43 AM
I'm using the Beta, but I suspect these issues are not specific to that release.
The rules for specifying values in different bases are not consistent.
e.g., the markers for specifying a number in hex, octal, decimal, and binary are #0h, #0o, #0d, and #0b respectively.
From the command line, lowercase letters work, but in programs you need uppercase letters to define the base.
Also, I can't seem to specify the number of bits. I thought it was #0d32, but from the command line this gets converted to #0d*32, and in programs it gets converted to the hex number d32 (I think hex is set somewhere in my settings).
So, how do I specify the number of bits? I did see #0:b in one spot in the user manual, but when I tried the colon it didn't seem to work (from the command line).
help???
Thx