Tuesday, March 27, 2012
Data File Growth operations
Are any operations suspended during a data file growth? Or is it just a performance hit?
It appears from a trace of mine that [during a data file growth] some (the
majority), but not all of my inserts are timing out. However, all updates
succeed, no updates time out. So, it seems I can partially answer my
question above, I think? Just looking for some clarification.
--
Message posted via http://www.sqlmonster.com|||Any operation that depends on a new extent allocation will suspend. If you
are inserting on a non-full page, it will work. In-place updates will work.
New rows that trigger an extent allocation will suspend.
You get the picture.
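If you want to line the timeouts up against the actual grow events, one option is to pull the auto-grow events out of a trace. Something along these lines should do it (untested, and it assumes SQL Server 2005 or later so that the default trace is available):
-- Untested sketch: list recent auto-grow events from the default trace.
-- EventClass 92 = Data File Auto Grow, 93 = Log File Auto Grow.
DECLARE @tracefile nvarchar(260);
SELECT @tracefile = path FROM sys.traces WHERE is_default = 1;
SELECT t.DatabaseName, t.FileName, t.StartTime, t.Duration -- Duration is reported in microseconds
FROM sys.fn_trace_gettable(@tracefile, DEFAULT) AS t
WHERE t.EventClass IN (92, 93)
ORDER BY t.StartTime;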
--
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"Robert Richards via SQLMonster.com" <forum@.SQLMonster.com> wrote in message
news:6bedf6ba67e1467f933788e8e0e9fad3@.SQLMonster.com...
> Are any operations suspended during a data file growth? Or is it just a
> performance hit?
> It appears from a trace of mine that [during a data file growth] some (the
> majority), but not all of my inserts are timing out. However, all updates
> succeed, no updates time out. So, it seems I can partially answer my
> question above, I think? Just looking for some clarification.
> --
> Message posted via http://www.sqlmonster.com|||Thanks Geoff. That is all clear except for when a 10% automatic growth
takes place.
My inserts all succeed in milliseconds outside of a data file growth.
During data file growth the applications initiating the inserts time out at
30 seconds. While some inserts are timing out at 30 seconds, during this
same time, the ones that are succeeding complete in milliseconds.
To try to clarify: I see an approximate ratio of 10:1, where 10 inserts time out for every 1 insert that succeeds. During a data file growth (the data file is set to automatically grow by 10%), what is happening is:
1. New extent allocations are taking place due to the 10% automatic growth.
2. The inserts that are succeeding are the ones going on a non-full page.
3. The inserts that are failing (timing out) are either triggering an
extent allocation, or are waiting for the extent allocation due to the 10%
automatic growth.
Am I clear in my understanding?
--
Message posted via http://www.sqlmonster.com|||You have a good grasp of the concepts. The solution is to not rely on
automatic growth. I use it as an emergency safety valve only. I
pro-actively manage the data space used in each database/filegroup so my
production systems never have to pause while they grow. Also, 10% is OK for
about 60-70% of databases. Many databases are too small or too large for
percentage growth to be effective (Think about a 2MB database. Now think
about a 300GB database.) I prefer to set an absolute amount to grow by,
just in case. That way, I know exactly how long it should take for every
auto-grow even though I hope not to use it.
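For example, switching a file from percentage growth to a fixed increment is a single ALTER DATABASE. This is only a sketch with made-up names; use sp_helpfile to find the real logical file name:
-- Untested sketch; substitute your own database name, logical file name, and increment.
ALTER DATABASE MyDatabase
MODIFY FILE (NAME = MyDatabase_Data, FILEGROWTH = 100MB);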
--
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"Robert Richards via SQLMonster.com" <forum@.SQLMonster.com> wrote in message
news:e5a3374e183c4893a73356c7666d88b3@.SQLMonster.com...
> Thanks Geoff. That is all clear except for when a 10% automatic growth
> takes place.
> My inserts all succeed in milliseconds outside of a data file growth.
> During data file growth the applications initiating the inserts time out at
> 30 seconds. While some inserts are timing out at 30 seconds, during this
> same time, the ones that are succeeding complete in milliseconds.
> To try to clarify, if I have an approximate ratio of 10:1, where 10 equals
> inserts that timeout for every 1 insert that succeeds, during a data file
> growth ( the data file is set to automatically grow by 10% ) what is
> happening is:
> 1. New extent allocations are taking place due to the 10% automatic growth.
> 2. The inserts that are succeeding are the ones going on a non-full page.
> 3. The inserts that are failing (timing out) are either triggering an
> extent allocation, or are waiting for the extent allocation due to the 10%
> automatic growth.
> Am I clear in my understanding?
> --
> Message posted via http://www.sqlmonster.com|||In light of this, I need further understanding.
When I manually expand the data file 1 GB, the expansion takes
approximately 2 minutes. When it automatically grows, and inserts are
timing out at the same time, the expansion takes approximately 20 minutes,
for 1 GB (the 10% equivalent).
How does all this we have been discussing equate to the increased expansion
time?
--
Message posted via http://www.sqlmonster.com|||It may be that the client operation that triggered the specific expansion
has timed out. This may result in the expansion being rolled back. It
probably repeats until the expansion happens for a transaction that doesn't
time out. Whatever causes it, you now have an even stronger reason to avoid
auto-grow.
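One more reason to pre-grow the file yourself during a quiet period, roughly like this (untested, placeholder names; note that SIZE is the new total size, not an increment):
-- Untested sketch of a pro-active manual grow.
ALTER DATABASE MyDatabase
MODIFY FILE (NAME = MyDatabase_Data, SIZE = 50GB);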
--
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"Robert Richards via SQLMonster.com" <forum@.SQLMonster.com> wrote in message
news:bb7f723949324ec5886d55b0e29ea65a@.SQLMonster.com...
> In light of this, I need further understanding.
> When I manually expand the data file 1 GB, the expansion takes
> approximately 2 minutes. When it automatically grows, and inserts are
> timing out at the same time, the expansion takes approimately 20 minutes,
> for 1 GB (the 10% equivalent).
> How does all this we have been discussing equate to the increased expansion time?
> --
> Message posted via http://www.sqlmonster.com
Monday, March 19, 2012
Data Conversion Numeric to date
I have some data which I am trying to put into a DM where I can use it as part of a cube (my first!!)
I have hit a small problem with dates. I get the date from the ERP system as a numeric field, and I need to convert it to a date format. The intention is to use this converted data with Named Calculations to derive Year, Month, Day etc.
However, I cannot seem to convert and store (in SQL) this column. Can anyone advise?
Thanks
The source column is called AHDATE and a sample value is 60703, which I am told is YY/MM/DD. Using a Data Conversion component transforming to data type date [DT_DATE] I get 12/03/2066 00:00:00, which is way out.
Thanks for any help
Robbie
|||60703 is not a literal that can be successfully converted to a date. "6" is not a year. You'll have to do something cleverer to work this out. i.e. Inside a Derived Column component extract all the different parts and concatenate them together to get what you are after.
e.g. "200" + SUBSTRING((DT_STR, 5, 1252)[columnname]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),4,2)
Something like that anyway. I haven't tested it.
-Jamie
|||
Jamie Thomson wrote:
60703 is not a literal that can be successfully converted to a date. "6" is not a year. You'll have to do something cleverer to work this out. i.e. Inside a Derived Column component extract all the different parts and concatenate them together to get what you are after.
e.g. "200" + SUBSTRING((DT_STR, 5, 1252)[columnname]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),4,2)
Something like that anyway. I haven't test it.
-Jamie
Thanks for the help. I think half my battle is how I am trying to store it. What should the data type be in the derived column, and also on the column of the table I am trying to store it in?
At the moment I am trying to store it as a date format; should I be storing it as just a char or something?
thanks again
|||Blackuke wrote:
Jamie Thomson wrote: 60703 is not a literal that can be successfully converted to a date. "6" is not a year. You'll have to do something cleverer to work this out. i.e. Inside a Derived Column component extract all the different parts and concatenate them together to get what you are after.
e.g. "200" + SUBSTRING((DT_STR, 5, 1252)[columnname]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),4,2)
Something like that anyway. I haven't test it.
-Jamie
Thanks for the help, I think half my battle is how I am trying to store it, what should the data type be in the derived column and also on the column of the table i am trying to store it in?
At the moment I am trying to store it as a date format should i be storing it as just a char or some thing
thanks again
Store it however you like. That's up to you. As is the data type of the output column from the Derived Column component.
If you want to store it in a datetime field, cast it as a datetime value. You can do this within the Derived Column component.
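On the table side, the destination column can simply be declared as datetime. A minimal sketch, with made-up table and column names:
-- Untested sketch; placeholder table and column names.
CREATE TABLE dbo.InvoiceStaging (AHDATE datetime NULL);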
-Jamie
|||
Blackuke wrote:
Jamie Thomson wrote: 60703 is not a literal that can be successfully converted to a date. "6" is not a year. You'll have to do something cleverer to work this out. i.e. Inside a Derived Column component extract all the different parts and concatenate them together to get what you are after.
e.g. "200" + SUBSTRING((DT_STR, 5, 1252)[columnname]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),4,2)
Something like that anyway. I haven't test it.
-Jamie
Thanks for the help, I think half my battle is how I am trying to store it, what should the data type be in the derived column and also on the column of the table i am trying to store it in?
At the moment I am trying to store it as a date format should i be storing it as just a char or some thing
thanks again
So much for this being easy, you make it sound so!!
I put the following in the derived column expression field but it remains red, saying the function SUBSTRING requires 3 parameters, not 1, blah blah blah. I have no idea; can someone advise?
"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),4,2)
thanks
|||
Blackuke wrote:
So much for this beeing easy, you make it sound so!!
I put the following in the derived column expression feild but it remain s red saying the function subtring requires 3 parameters not 1 bla bla bla, I have no idea can someone advise
"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),4,2)
thanks
Yeah, my bad, sorry. It's just a matter of counting brackets though. Try this instead:
"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE],1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE],2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE],4,2)
rather than this, which is what it was before (the offending characters were the extra closing brackets after each [AHDATE]):
"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),4,2)
-Jamie
|||
Thanks guys, I have achieved my goal for today, which was to import the data into SQL 2005 as a date, thanks to your help.
I used a Derived Column with the following expression:
"200" + SUBSTRING((DT_STR,5,1252)AHDATE,1,1) + "-" + SUBSTRING((DT_STR,5,1252)AHDATE,2,2) + "-" + SUBSTRING((DT_STR,5,1252)AHDATE,4,2)
I selected database timestamp as the data type, and on the SQL 2005 table itself I changed the date column's data type to datetime.
This was the only way I could get it to work.
Thanks guys for pointing me in the right direction; where shall I send the beer?
(If you have an Exchange 2007 question, I am your man!!!)
|||Jamie Thomson wrote:
Blackuke wrote: So much for this beeing easy, you make it sound so!!
I put the following in the derived column expression feild but it remain s red saying the function subtring requires 3 parameters not 1 bla bla bla, I have no idea can someone advise
"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),4,2)
thanks
yeah my bad, sorry. Its just a matter of counting brackets though. Try this instead:
"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE],1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE],2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE],4,2)
rather than this which is what it was before (I've highlighted the offending characters in blue):
"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),4,2)
-Jamie
Thanks, I managed to Google around to find it out. Thanks for the help; no doubt you will see more of me!!!!
|||Hi Jamie
It worked great until I put in the historical data
for data that was like 61103 (YMMDD) the expression worked great, I got 2006/11/03
however, when I got data from 2000 it was like this: 110 (M/DD, e.g. 2000/01/10, since the leading zeros are dropped), and this broke the import
is there a way to say that there are 6 digits, the first needs to be a 2, and all the missing digits need to be 0's?
Please help
Thanks again
|||Blackuke wrote:
Hi Jamie
It worked great until I put in the historical data
for data that was like 61103 (YMMDD) the expression worked great I got 2006/11/03
however when I got data from 2000 it was like this, 110 (M/DD) (eg 2006/01/10) this broke the import
is there a way to say that there are 6 digits, the first needs to be a 2 and all the missing digits need to be 0's
Please help
Thanks again
Yes.
I'll point you in the right direction but you're gonna have to piece this together yourself I'm afraid. I'm busy :)
The conditional operator will help you out:
http://msdn2.microsoft.com/en-us/sql/ms141680.aspx
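Or, if you would rather tidy it up in T-SQL after the load instead of in the Derived Column, something like the following ought to work. It is an untested sketch: it assumes AHDATE is an integer, that every year falls between 2000 and 2049, and the table name is a placeholder. It left-pads the value to six digits and converts with style 12 (yymmdd):
-- Untested T-SQL sketch: pad AHDATE to six digits, then convert as yymmdd (style 12).
SELECT CONVERT(datetime, RIGHT('000000' + CAST(AHDATE AS varchar(6)), 6), 12) AS AHDateConverted
FROM dbo.StagingTable; -- placeholder table name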
-Jamie
|||
Thanks I appreciate it
|||
I could not get it working, so I decided to do 3 conditional splits and then the derived columns.
Seems to be working. Now I just need to work out how to do a time dimension; I have the invoice date in the fact table, so I think I have already made a mistake.
Thanks again