This also manifested in a different issue I just filed (http://entityframework.codeplex.com/workitem/734).
Created this test table:
CREATE TABLE [dbo].[DecimalTestingTable](
[Dec1] [decimal](6, 1) NULL,
[Dec2] [decimal](6, 2) NULL,
[Dec3] [decimal](6, 3) NULL,
[Dec4] [decimal](6, 4) NULL,
[Dec5] [decimal](6, 5) NULL,
[NN1] [decimal](6, 1) NOT NULL,
[NN2] [decimal](6, 2) NOT NULL,
[NN3] [decimal](6, 3) NOT NULL,
[NN4] [decimal](6, 4) NOT NULL,
[NN5] [decimal](6, 5) NOT NULL
)
Inserting these values via manual SQL:
insert into DecimalTestingTable values
(
0.88888888,
0.88888888,
0.88888888,
0.88888888,
0.88888888,
0.88888888,
0.88888888,
0.88888888,
0.88888888,
0.88888888
)
This results in SQL Server rounding the values as it writes them, so selecting back out of the table gives these values:
Dec1  Dec2  Dec3   Dec4    Dec5     NN1  NN2   NN3    NN4     NN5
0.9   0.89  0.889  0.8889  0.88889  0.9  0.89  0.889  0.8889  0.88889
However, if I have a model (code first in this case, although I would guess that doesn't matter) that includes the right precision/scale info for the columns, the values are modified on the client side instead of being sent 'whole'/unmodified to SQL Server.
var context = new DecimalTestingContext();
context.DecimalTestingTables.Add(new DecimalTestingTable()
{
Dec1 = 0.88888888M,
Dec2 = 0.88888888M,
Dec3 = 0.88888888M,
Dec4 = 0.88888888M,
Dec5 = 0.88888888M,
NN1 = 0.88888888M,
NN2 = 0.88888888M,
NN3 = 0.88888888M,
NN4 = 0.88888888M,
NN5 = 0.88888888M,
});
context.SaveChanges();
If you don't modify the reverse-engineered model, you'll get the generated SQL below (this is what workitem 734 is about; I'm just including it here in case others try to repro this and see the same SQL):
exec sp_executesql N'insert [dbo].[DecimalTestingTable]([NN1], [NN2], [NN3], [NN4], [NN5], [Dec1], [Dec2], [Dec3], [Dec4], [Dec5])
values (@0, @1, @2, @3, @4, @5, @6, @7, @8, @9)
',N'@0 decimal(18,2),@1 decimal(18,2),@2 decimal(18,2),@3 decimal(18,2),@4 decimal(18,2),@5 decimal(18,2),@6 decimal(18,2),@7 decimal(18,2),@8 decimal(18,2),@9 decimal(18,2)',@0=0.88,@1=0.88,@2=0.88,@3=0.88,@4=0.88,@5=0.88,@6=0.88,@7=0.88,@8=0.88,@9=0.88
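(Side note for anyone reproing: I captured the SQL above with SQL Profiler, but if you're on a build that has EF6's Database.Log, you can dump the same thing to the console; this is just a convenience, not part of the original repro.)
var context = new DecimalTestingContext();
// EF6+ only: logs the exact sp_executesql text, parameter declarations, and parameter values that SaveChanges sends.
context.Database.Log = Console.WriteLine;
// ...then run the same Add/SaveChanges repro shown below and look for the decimal(18,2) declarations and the 0.88 parameter values.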
If the model includes the correct precision/scale information like this:
this.Property(t => t.Dec1).HasColumnName("Dec1").HasPrecision(6, 1);
this.Property(t => t.Dec2).HasColumnName("Dec2").HasPrecision(6, 2);
this.Property(t => t.Dec3).HasColumnName("Dec3").HasPrecision(6, 3);
this.Property(t => t.Dec4).HasColumnName("Dec4").HasPrecision(6, 4);
this.Property(t => t.Dec5).HasColumnName("Dec5").HasPrecision(6, 5);
this.Property(t => t.NN1).HasColumnName("NN1").HasPrecision(6, 1);
this.Property(t => t.NN2).HasColumnName("NN2").HasPrecision(6, 2);
this.Property(t => t.NN3).HasColumnName("NN3").HasPrecision(6, 3);
this.Property(t => t.NN4).HasColumnName("NN4").HasPrecision(6, 4);
this.Property(t => t.NN5).HasColumnName("NN5").HasPrecision(6, 5);
then we still end up with client-side truncation of the value instead of either 1) rounding the way SQL Server does (which might require checking what 'mode' SQL Server is in, assuming that behavior is settable), or 2) (IMHO, preferred) just sending the unmodified value through to the server so we get whatever the server-defined behavior is (which, by default, appears to be rounding).
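To make the truncate-versus-round difference concrete for the scale-2 column, here's a small stand-alone snippet (plain .NET decimal math, nothing EF-specific; the away-from-zero midpoint mode is my assumption, and it doesn't actually matter for this particular value) showing both results for 0.88888888:
decimal value = 0.88888888M;
// Truncate to scale 2 (what EF appears to be doing today before sending the parameter):
decimal truncated = Math.Truncate(value * 100M) / 100M;                  // 0.88
// Round to scale 2 (what SQL Server does on insert, per the values stored above):
decimal rounded = Math.Round(value, 2, MidpointRounding.AwayFromZero);   // 0.89
Console.WriteLine("truncated = {0}, rounded = {1}", truncated, rounded);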
I'd imagine the value is modified on the client side before being sent to the server so that the client and the server stay in sync on the stored value without re-querying the just-inserted-or-updated data, so keeping that behavior might be necessary. If so, it seems like, worst case, the model should let me configure the truncate-versus-round behavior: either a new overload of HasPrecision that accepts a 'bool shouldRoundInsteadOfTruncate', or a new model configuration like .ShouldRoundToScale() (or whatever it would be called) that I could chain after HasPrecision.
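Just to be explicit about the shape of what I'm asking for, either of these would work for me (both are hypothetical; neither exists today):
// Hypothetical overload of HasPrecision (does not exist today):
this.Property(t => t.Dec2).HasColumnName("Dec2").HasPrecision(6, 2, shouldRoundInsteadOfTruncate: true);
// Or a hypothetical chained configuration call (also does not exist today):
this.Property(t => t.Dec2).HasColumnName("Dec2").HasPrecision(6, 2).ShouldRoundToScale();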
My apologies if the 'round instead of truncate when modifying the value before sending to the server' behavior is already configurable and I just missed it. :)
Thanks!
Comments: verified, closing