Wednesday, September 2, 2015

the difference between decimals and doubles in C#

The decimal is more precise. It's "better," but it takes up more memory as the expense of being so: a decimal is a 128-bit base-10 type that holds 28-29 significant digits (great for money), while a double is a 64-bit binary floating-point type good for roughly 15-17 digits. There is also a type called a float which is the same sort of thing yet crummier (32 bits, roughly 6-9 digits, so less memory, less impressive) than either the decimal or the double. For more see:
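A quick sketch of the difference in practice (my own illustrative console snippet, not from any particular source): 0.1 can't be represented exactly in binary, so adding it up in a double drifts, while decimal stores it exactly in base 10.

```csharp
using System;

class PrecisionDemo
{
    static void Main()
    {
        // double is 64-bit binary floating point: 0.1 has no exact
        // binary representation, so ten additions drift off of 1.0.
        double dSum = 0.0;
        for (int i = 0; i < 10; i++) dSum += 0.1;
        Console.WriteLine(dSum == 1.0);   // False

        // decimal is 128-bit base-10 floating point: 0.1m is exact,
        // so the same loop lands on exactly 1.0.
        decimal mSum = 0.0m;
        for (int i = 0; i < 10; i++) mSum += 0.1m;
        Console.WriteLine(mSum == 1.0m);  // True

        // float is the crummier 32-bit version of double: same binary
        // rounding problem, with far fewer digits to work with.
        float fSum = 0.0f;
        for (int i = 0; i < 10; i++) fSum += 0.1f;
        Console.WriteLine(fSum == 1.0f);
    }
}
```

This is why money code tends to use decimal and scientific/graphics code tends to use double or float: exactness versus speed and memory.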
