Numbers
Defaults
- A literal value with no decimal point generally defaults to an int (larger values become uint, long, or ulong, whichever fits first).
- A literal number with a decimal point defaults to a double.
Console.WriteLine(1 / 3);   // 0 — both operands are int, so this is integer division
Console.WriteLine(1F / 3);  // ~0.3333333 — the F suffix makes the division floating-point
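- A quick way to confirm the defaults (a minimal sketch; GetType() reports the underlying .NET type names):
Console.WriteLine((1).GetType());    // System.Int32  (int)
Console.WriteLine((1.0).GetType());  // System.Double (double)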
Literal Suffix
- F : float
- D : double
- M : decimal
- L : long or ulong
- U : uint or ulong
- UL, LU : ulong
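- For example (a minimal sketch; each suffixed literal reports its runtime type):
Console.WriteLine(1.5F.GetType());  // System.Single  (float)
Console.WriteLine(1.5D.GetType());  // System.Double  (double)
Console.WriteLine(1.5M.GetType());  // System.Decimal (decimal)
Console.WriteLine(1L.GetType());    // System.Int64   (long)
Console.WriteLine(1U.GetType());    // System.UInt32  (uint)
Console.WriteLine(1UL.GetType());   // System.UInt64  (ulong)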
Digit Separator
- Large numbers can be difficult to read. To improve readability, C# 7.0 added a digit separator, the underscore (_), for numeric literals.
- You can use the digit separator to create whatever grouping you like as long as the underscore occurs between the first and last digits.
Console.WriteLine(9_814_072_356M);  // 9814072356 — standard thousands grouping
Console.WriteLine(98_140_72_356M);  // 9814072356 — any grouping is allowed
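- A minimal sketch of the placement rule (the commented-out lines would not compile):
Console.WriteLine(1_000_000);   // OK: every underscore sits between two digits
// Console.WriteLine(_1000);    // not a literal at all — parsed as an identifier
// Console.WriteLine(1000_);    // error: underscore after the last digit
// Console.WriteLine(1_.5);     // error: underscore adjacent to the decimal point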
- To display a numeric value in hexadecimal format, use the X format specifier in a format string:
Console.WriteLine($"0x{42:X}");