Hi All,

I have a value stored in a SQL Server database as 1.000, but when I read it into a DataTable it is trimmed to two decimal places, 1.00. How can I read the original value into the DataTable as 1.000 instead of 1.00?

By the way, I am using the .NET 2.0 Framework only.

Thanks

What I have tried:

I have tried converting it using Convert.ToDecimal, Math.Round, etc., but that does not work.
Posted
Updated 15-Jun-16 14:05pm

## Solution 3

I'd say use decimal as your data type in SQL with a scale of 3, e.g.:
SQL
```
DECLARE @number DECIMAL(16,3) = 1
SELECT @number
```

Your result will return 1.000

Declare a variable of the same data type in C#, e.g.:
C#
`decimal number = 1.000m; // value from db`

The value compares equal to 1, because 1 and 1.000 are the same number. But since you still want it displayed with three decimal places, you have to apply string formatting, e.g.:
C#
`textBox1.Text = number.ToString("N3");`

This will return 1.000
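To sketch the whole round trip in C# (a minimal example; the column name "Amount" is hypothetical, and the hand-built DataTable stands in for one filled by a SqlDataAdapter from the DECIMAL(16,3) column above):

```csharp
using System;
using System.Data;
using System.Globalization;

class Demo
{
    static void Main()
    {
        // Stand-in for a DataTable filled by a SqlDataAdapter from a
        // DECIMAL(16,3) column (hypothetical column name "Amount").
        DataTable table = new DataTable();
        table.Columns.Add("Amount", typeof(decimal));
        table.Rows.Add(1.000m);

        decimal number = (decimal)table.Rows[0]["Amount"];

        // Format with three decimal places for display.
        Console.WriteLine(number.ToString("N3", CultureInfo.InvariantCulture)); // prints "1.000"
    }
}
```

InvariantCulture is used so the decimal separator is always a period; drop it if you want the user's locale.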


## Solution 1

The question makes no sense at all. 1.000 and 1.00 are exactly the same number. Nothing is "trimmed". It looks like you need to understand the concept of a "number", which you are confusing with its textual representation. If so, it's pretty hard to help with that; it's a matter of fundamental, elementary education.

Now, to control string representation of numbers, formatting should be used. For example, please see:
Double.ToString Method (System),
Standard Numeric Format Strings,
Custom Numeric Format Strings.
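As a quick illustration of the difference between the standard and custom numeric format strings those pages describe (a minimal sketch; InvariantCulture is used so the decimal separator is always a period):

```csharp
using System;
using System.Globalization;

class FormatDemo
{
    static void Main()
    {
        double value = 1.0;
        CultureInfo inv = CultureInfo.InvariantCulture;

        // Standard numeric format strings: a format letter plus an
        // optional precision specifier.
        Console.WriteLine(value.ToString("F3", inv)); // prints "1.000"
        Console.WriteLine(value.ToString("F2", inv)); // prints "1.00"

        // Custom numeric format string: explicit digit placeholders.
        Console.WriteLine(value.ToString("0.000", inv)); // prints "1.000"
    }
}
```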

—SA

Charles Shob 15-Jun-16 0:43am
Yes. As you said, it is for textual representation, and for calculations too. And I agree, yes, I am a beginner.
Sergey Alexandrovich Kryukov 15-Jun-16 2:24am
It's not a problem to be a beginner, but you need to focus on the fundamentals as much as possible. The notion of "number" is not even programming, this is mathematics, and the notion is really complicated, if you get to the depths of it.
All right, the string formatting gives you the freedom to control your textual representation. Will you accept the answer formally now?
—SA
an0ther1 15-Jun-16 18:04pm
1.00 is equal to 1.0000000000. The data type of a column in a DataTable determines the precision (the number of decimal places the value is kept to), but you don't normally display them all except in certain circumstances.

Try with a different value - eg; 1.23456 or 0.1234 - what does that get converted to when you display it?
If you have 1 & then you add 0.001 & then display it does it still show as 1.00 or 1.001?

Whenever a value is displayed it is converted to a string, hence some precision may be lost, and there is a default conversion: Double.ToString(); is the same as Double.ToString("G");, whereas Double.ToString("F4"); produces a different string value.
The links Sergey provided explain this.
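A small sketch of that point, comparing the default conversion with an explicit precision (behaviour shown with InvariantCulture):

```csharp
using System;
using System.Globalization;

class PrecisionDemo
{
    static void Main()
    {
        double value = 1.23456;
        CultureInfo inv = CultureInfo.InvariantCulture;

        // The default conversion uses the "G" (general) format:
        Console.WriteLine(value.ToString(inv));       // prints "1.23456"
        Console.WriteLine(value.ToString("G", inv));  // prints "1.23456"

        // "F4" fixes the output at exactly four decimal places, rounding.
        Console.WriteLine(value.ToString("F4", inv)); // prints "1.2346"
    }
}
```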

Kind Regards

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
