Is there any difference between DECIMAL and NUMERIC in SQL Server?

Sql Server | Sql | Datatypes

Sql Server Problem Overview


Is there any difference between DECIMAL and NUMERIC data types in SQL Server?

When should I use DECIMAL and when NUMERIC?

Sql Server Solutions


Solution 1 - Sql Server

They are the same. Numeric is functionally equivalent to decimal.

MSDN: decimal and numeric
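For example, here is a minimal sketch (the variable names are just for illustration) showing that the two declarations behave identically:

    -- One variable of each type, same precision and scale, same value
    DECLARE @d DECIMAL(10, 2) = 12345.67;
    DECLARE @n NUMERIC(10, 2) = 12345.67;

    -- They store, round, and compare identically
    SELECT @d AS decimal_value,
           @n AS numeric_value,
           CASE WHEN @d = @n THEN 'equal' ELSE 'different' END AS comparison;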

Solution 2 - Sql Server

This is what the SQL:2003 standard (§6.1 Data Types) says about the two:

 <exact numeric type> ::=
    NUMERIC [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
  | DECIMAL [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
  | DEC [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
  | SMALLINT
  | INTEGER
  | INT
  | BIGINT

 ...

21) NUMERIC specifies the data type
    exact numeric, with the decimal
    precision and scale specified by the
    <precision> and <scale>.
    
22) DECIMAL specifies the data type
    exact numeric, with the decimal scale
    specified by the <scale> and the
    implementation-defined decimal
    precision equal to or greater than the
    value of the specified <precision>.
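Put differently: the standard obliges NUMERIC to use exactly the precision you declare, while it lets a DECIMAL end up with a higher precision than requested. SQL Server does not take advantage of that latitude; both types report exactly the declared precision. A small sketch you can run to confirm it (the temp table and column names are just for illustration):

    -- Create a throwaway table with one column of each type
    CREATE TABLE #precision_check
    (
        d_col DECIMAL(10, 4),
        n_col NUMERIC(10, 4)
    );

    -- Both columns report precision 10 and scale 4, exactly as declared
    SELECT c.name AS column_name,
           t.name AS type_name,
           c.precision,
           c.scale
    FROM tempdb.sys.columns AS c
    JOIN tempdb.sys.types   AS t ON t.user_type_id = c.user_type_id
    WHERE c.object_id = OBJECT_ID('tempdb..#precision_check');

    DROP TABLE #precision_check;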

Solution 3 - Sql Server

To my knowledge there is no difference between the NUMERIC and DECIMAL data types. They are synonymous with each other, and either one can be used. DECIMAL and NUMERIC are numeric data types with fixed precision and scale.
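They do appear as two distinct entries in SQL Server's type catalogue, but that naming is essentially the only visible difference. A minimal sketch to see it:

    -- Both types appear in the catalogue, with identical storage characteristics
    SELECT name, system_type_id, max_length, precision, scale
    FROM sys.types
    WHERE name IN ('decimal', 'numeric');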

Edit:

Speaking to a few colleagues, maybe it has something to do with DECIMAL being the ANSI SQL standard and NUMERIC being the one Microsoft prefers, as it is more commonly found in programming languages. ...Maybe ;)

Solution 4 - Sql Server

Joakim Backman's answer is specific, but this may bring additional clarity to it.

There is a minor difference. As per SQL For Dummies, 8th Edition (2013):

> The DECIMAL data type is similar to NUMERIC. ... The difference is that your implementation may specify a precision greater than what you specify — if so, the implementation uses the greater precision. If you do not specify precision or scale, the implementation uses default values, as it does with the NUMERIC type.

It seems that the difference in some implementations of SQL is in data integrity: DECIMAL may allow a precision beyond what is defined, based on implementation defaults, whereas NUMERIC does not.
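On SQL Server itself no such difference is observable: both types enforce the declared precision, and an out-of-range value fails the same way for each. A small sketch (the values are chosen only to trigger the overflow):

    DECLARE @d DECIMAL(5, 2);  -- can hold at most 999.99
    DECLARE @n NUMERIC(5, 2);  -- same range

    -- Both assignments fail with the same arithmetic overflow error,
    -- so neither type silently "borrows" extra precision
    SET @d = 1234.56;
    SET @n = 1234.56;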

Solution 5 - Sql Server

They are synonyms; there is no difference at all. DECIMAL and NUMERIC are numeric data types with fixed precision and scale.

-- Initialize a variable, give it a data type and an initial value
-- (decimal(18,8) and numeric(18,8) are interchangeable; both need 9 bytes of storage)
declare @myvar as decimal(18,8);

-- Assign a value to the variable
set @myvar = 123456.7;

-- Retrieve that value
select @myvar as myVariable;

Solution 6 - Sql Server

They are exactly the same. Whichever one you use, be consistent: pick one of them and use it throughout your database.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Dhanapal | View Question on Stackoverflow
Solution 1 - Sql Server | Guffa | View Answer on Stackoverflow
Solution 2 - Sql Server | Joakim Backman | View Answer on Stackoverflow
Solution 3 - Sql Server | kevchadders | View Answer on Stackoverflow
Solution 4 - Sql Server | Alex Firsov | View Answer on Stackoverflow
Solution 5 - Sql Server | Riya | View Answer on Stackoverflow
Solution 6 - Sql Server | Faruk | View Answer on Stackoverflow