
Monday, March 19, 2012

Data Conversion supported on Standard Edition?

I created an Integration Services Package that runs fine from my local computer using BIDS. However when I imported into our SQL Server and try to run it from there I get the following error:

DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "Data Conversion"

We are running SQL Server 2005 Standard Edition 64-bit.

We have integration services installed on the server. Is data conversion something that is not supported on Standard Edition?

Also have a similar message for "Send Mail Task."

Is there anywhere that outlines what features are supported on each version?

What version of SQL Server are you running on your developer's workstation?|||

I'm running BIDS on my workstation, which connects to our SQL Server (the same SQL Server the integration package will not run on once imported). So I'm not actually running a full-blown SQL Server database engine on my workstation, just BIDS along with Management Studio.

The Management Studio on my workstation is 9.00.3042.

SQL Server Integration Services is 9.00.3042. (Found by going to Help > About in Visual Studio)

Our SQL Server version is 9.00.3050

|||So both your workstation and the server are running SQL Server Standard edition? Not developer/enterprise edition?|||

Server is definitely Standard Edition. When I installed the workstation components on my workstation I honestly don't remember if I installed them from the Standard edition or Developer edition. Most likely I installed the workstation components using the Developer edition. Is there a way to check?

I guess that would explain why it works on my workstation but not on the server?

|||

Erikk Ross wrote:

Server is definitely Standard Edition. When I installed the workstation components on my workstation I honestly don't remember if I installed them from the Standard edition or Developer edition. Most likely I installed the workstation components using the Developer edition. Is there a way to check?

Run this query on the server and again on your local version:

SELECT SERVERPROPERTY('productversion'), SERVERPROPERTY ('productlevel'), SERVERPROPERTY ('edition')|||

Yeah, like I said I don't have the actual Database engine installed on my workstation. But our server version is SP2: 9.00.3050.

So I guess Data Conversion is not supported in Standard Edition? Or Send Mail? Is there a matrix somewhere that shows what features in Integration Services are supported in each version of SQL Server? It would seem to me that data conversion is something that is pretty common...I'm a little surprised it would require the Enterprise edition.

|||Alright, well, I do think it's because you're developing a package in a Developer environment, which is a higher level than the Standard edition that your server runs. If your server was an Enterprise version, you wouldn't have a problem (of course).

Uninstall and reinstall the SSIS components from the Standard Edition CDs and you should be fine. The Data Conversion component should work in the Standard Edition.|||

Phil Brammer wrote:

Alright, well, I do think it's because you're developing a package in a Developer environment, which is a higher level than the Standard edition that your server runs. If your server was an Enterprise version, you wouldn't have a problem (of course).

Uninstall and reinstall the SSIS components from the Standard Edition CDs and you should be fine. The Data Conversion component should work in the Standard Edition.

That was indeed the solution. I just did a quick test and creating the package from a Standard edition version did allow the data conversion to work correctly. Thank you!!

|||

Well, uninstalling the SQL Server workstation components on my development machine and reinstalling the Standard version did not fix my problem. Any package created on my local development machine still does not work when imported into SQL Server. I get the same product level too low error.

At least the good news is that I can create packages on the SQL Server itself and they seem to work fine. My best guess is that uninstalling the developer edition and reinstalling the standard edition just wasn't enough. I would suspect that if I was to completely wipe my machine clean then install Standard Edition it would probably be ok. But that is more hassle than it's worth.

|||

Erikk Ross wrote:

Well, uninstalling the SQL Server workstation components on my development machine and reinstalling the Standard version did not fix my problem. Any package created on my local development machine still does not work when imported into SQL Server. I get the same product level too low error.

At least the good news is that I can create packages on the SQL Server itself and they seem to work fine. My best guess is that uninstalling the developer edition and reinstalling the standard edition just wasn't enough. I would suspect that if I was to completely wipe my machine clean then install Standard Edition it would probably be ok. But that is more hassle than it's worth.

Any NEW packages created still don't work?|||

Correct, I created a brand new package. Well I first tried to rebuild my old package but that didn't work either, so I just created a new one. Same problems, runs fine on my local workstation, but when imported into SQL Server gives the same error.

I am currently installing standard edition workstation components on a new workstation with Visual Studio and will try to create a package from there and import it and see if that works.

|||Run this on the server, please:
SELECT SERVERPROPERTY('productversion'), SERVERPROPERTY ('productlevel'), SERVERPROPERTY ('edition')|||

Phil Brammer wrote:

Run this on the server, please:
SELECT SERVERPROPERTY('productversion'), SERVERPROPERTY ('productlevel'), SERVERPROPERTY ('edition')

9.00.3050.00 SP2 Standard Edition (64-bit)

Well now I am completely lost. I just installed the standard edition workstation components on a brand new machine. A machine that has never had SQL Server installed on it. I created a new package in Visual Studio, imported it into our SQL Server, ran it, and still get the same ProductLevelToLow errors.

Error: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "Data Conversion".

|||

After reading this I realized my problem: http://blogs.msdn.com/michen/archive/2006/08/11/package-exec-location.aspx

I was running the package from my workstation using Management Studio, unaware that the package was actually running on my local workstation and not on the server. Since it was running on my workstation, it required Integration Services to be installed locally, which it wasn't, and that is why I got those error messages. As soon as I ran it directly from the server it worked fine.

*sigh* I wish this was more obvious. I thought that by running it in Management Studio it was just automatically running it on the server.
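
For reference, a package stored in the server's msdb can be run on the server itself from a command prompt on that machine with dtexec; the package and server names below are placeholders:

dtexec /SQL "\MyPackage" /SERVER "MyServerName"

Running a package from Management Studio on a workstation, as described above, loads it into the local SSIS runtime instead, which is why the local installation mattered.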

Data Conversion Problem

I have tried the following in all kinds of combinations but cannot get it to
work. At this point I am not seeing the problem straight and need new eyes
to guide me.
CREATE TABLE EmpEvals
(
last_name varchar(25),
first_name varchar(25),
begin_dt datetime,
adj_beg_dt datetime,
termination_date datetime,
eval_months int
)
INSERT INTO EmpEvals (last_name, first_name, begin_dt, adj_beg_dt,
termination_date, eval_months )
SELECT eval_months = CAST(DATEDIFF(m, begin_dt, GETDATE() ) as int ),
last_name, first_name, begin_dt, adj_beg_dt, termination_date
FROM employee
I am getting an error going from datetime to int with the CONVERT/CAST in
the SELECT statement. I've tried both CONVERT and CAST and just cannot get
the CONVERT/CAST the way SQL Server wants it. I even tried changing
eval_months to datetime and it complained. Someone PLEASE help before I have
no hair left to pull out!
TIA
Mike
On Wed, 28 Jun 2006 08:08:52 -0400, "Mike" <mavila@.shoremortgage.com>
wrote:

>INSERT INTO EmpEvals (last_name, first_name, begin_dt, adj_beg_dt,
>termination_date, eval_months )
>SELECT eval_months = CAST(DATEDIFF(m, begin_dt, GETDATE() ) as int ),
>last_name, first_name, begin_dt, adj_beg_dt, termination_date
>FROM employee
The order of the column list of the INSERT must match the order of the
column list of the SELECT. In the code above, last_name is getting
the data for eval_months, first_name is getting last_name, etc.
Roy Harvey
Beacon Falls, CT|||Hi Mike
DATEDIFF returns an INT so there should be no problem with conversion. In
fact, you shouldn't even need the CAST at all.
I think the problem is that you are returning the value for eval_months
first in your select, but it is the last column in the table. So all your
select values are trying to go into the wrong columns, and there are
conversion errors.
Using eval_months in your SELECT only gives a column header to the result,
it does not map it to a column of the table you are inserting into. You could
give it any column name you want and it would be ignored, since you are
inserting into a table, and not returning the SELECT result to the client.
Try this:
INSERT INTO EmpEvals (last_name, first_name, begin_dt, adj_beg_dt,
termination_date, eval_months )
SELECT last_name, first_name, begin_dt, adj_beg_dt, termination_date,
DATEDIFF(m, begin_dt, GETDATE() )
FROM employee
HTH
Kalen Delaney, SQL Server MVP
"Mike" <mavila@.shoremortgage.com> wrote in message
news:Oc9REwqmGHA.5100@.TK2MSFTNGP04.phx.gbl...
>I have tried the following in all kinds of combinations but cannot get it
>to work. At this point I am not seeing the problem straight and need new
>eyes to guide me.
>
> CREATE TABLE EmpEvals
> (
> last_name varchar(25),
> first_name varchar(25),
> begin_dt datetime,
> adj_beg_dt datetime,
> termination_date datetime,
> eval_months int
> )
> INSERT INTO EmpEvals (last_name, first_name, begin_dt, adj_beg_dt,
> termination_date, eval_months )
> SELECT eval_months = CAST(DATEDIFF(m, begin_dt, GETDATE() ) as int ),
> last_name, first_name, begin_dt, adj_beg_dt, termination_date
> FROM employee
>
> I am getting an error going from datetime to int with the CONVERT/CAST in
> the SELECT statement. I've tried both CONVERT and CAST and just cannot get
> the CONVERT/CAST the way SQL Server wants it. I even tried changing
> eval_months to datetime and it complained. Someone PLEASE help before I
> have no hair left to pull out!
> TIA
> Mike
>|||Mike
How about moving the eval_months column to the beginning of the column list?
INSERT INTO EmpEvals (eval_months, last_name, first_name, begin_dt,
adj_beg_dt,
termination_date )
SELECT eval_months = CAST(DATEDIFF(m, begin_dt, GETDATE() ) as int ),
last_name, first_name, begin_dt, adj_beg_dt, termination_date
FROM employee
"Mike" <mavila@.shoremortgage.com> wrote in message
news:Oc9REwqmGHA.5100@.TK2MSFTNGP04.phx.gbl...
>I have tried the following in all kinds of combinations but cannot get it
>to work. At this point I am not seeing the problem straight and need new
>eyes to guide me.
>
> CREATE TABLE EmpEvals
> (
> last_name varchar(25),
> first_name varchar(25),
> begin_dt datetime,
> adj_beg_dt datetime,
> termination_date datetime,
> eval_months int
> )
> INSERT INTO EmpEvals (last_name, first_name, begin_dt, adj_beg_dt,
> termination_date, eval_months )
> SELECT eval_months = CAST(DATEDIFF(m, begin_dt, GETDATE() ) as int ),
> last_name, first_name, begin_dt, adj_beg_dt, termination_date
> FROM employee
>
> I am getting an error going from datetime to int with the CONVERT/CAST in
> the SELECT statement. I've tried both CONVERT and CAST and just cannot get
> the CONVERT/CAST the way SQL Server wants it. I even tried changing
> eval_months to datetime and it complained. Someone PLEASE help before I
> have no hair left to pull out!
> TIA
> Mike
>|||BLESS YOU!! That worked nicely. I need a vacation. I'm getting by
facts.
Thanks.
Mike
"Roy Harvey" <roy_harvey@.snet.net> wrote in message
news:urs4a21tmcnc7kj680mk4vjbp4ip1d7bfb@.4ax.com...
> On Wed, 28 Jun 2006 08:08:52 -0400, "Mike" <mavila@.shoremortgage.com>
> wrote:
>
> The order of the column list of the INSERT must match the order of the
> column list of the SELECT. In the code above, last_name is getting
> the data for eval_months, first_name is getting last_name, etc.
> Roy Harvey
> Beacon Falls, CT

Data Conversion Numeric to date

I have some data which I am trying to put into a DM where I can use it as part of a cube (my first!!)

I have hit a small problem with dates. I get it from the ERP system as a numeric field, and I need to convert it to a date format. The intention is to use this converted data with Named Calculations to derive Year, Month, Day etc.

However I cannot seem to be able to convert and store this column (in SQL). Can anyone advise?

Thanks

The source column is called AHDATE and a sample value is 60703, which I am told is YY/MM/DD. Using a data conversion component transforming to data type date [DT_DATE] I get 12/03/2066 00:00:00

Which is way out

Thanks for any help

Robbie

|||

60703 is not a literal that can be successfully converted to a date. "6" is not a year. You'll have to do something cleverer to work this out. i.e. Inside a Derived Column component extract all the different parts and concatenate them together to get what you are after.

e.g. "200" + SUBSTRING((DT_STR, 5, 1252)[columnname]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),4,2)

Something like that anyway. I haven't test it.

-Jamie

|||

Jamie Thomson wrote:

60703 is not a literal that can be successfully converted to a date. "6" is not a year. You'll have to do something cleverer to work this out. i.e. Inside a Derived Column component extract all the different parts and concatenate them together to get what you are after.

e.g. "200" + SUBSTRING((DT_STR, 5, 1252)[columnname]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),4,2)

Something like that anyway. I haven't test it.

-Jamie

Thanks for the help, I think half my battle is how I am trying to store it, what should the data type be in the derived column and also on the column of the table i am trying to store it in?

At the moment I am trying to store it as a date format should i be storing it as just a char or some thing

thanks again

|||

Blackuke wrote:

Jamie Thomson wrote:

60703 is not a literal that can be successfully converted to a date. "6" is not a year. You'll have to do something cleverer to work this out. i.e. Inside a Derived Column component extract all the different parts and concatenate them together to get what you are after.

e.g. "200" + SUBSTRING((DT_STR, 5, 1252)[columnname]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),4,2)

Something like that anyway. I haven't test it.

-Jamie

Thanks for the help, I think half my battle is how I am trying to store it, what should the data type be in the derived column and also on the column of the table i am trying to store it in?

At the moment I am trying to store it as a date format should i be storing it as just a char or some thing

thanks again

Store it however you like. That's up to you. As is the data type of the output column from the Derived Column component.

If you want to store it in a datetime field, cast it as a datetime value. You can do this within the Derived Column component.

-Jamie

|||

Blackuke wrote:

Jamie Thomson wrote:

60703 is not a literal that can be successfully converted to a date. "6" is not a year. You'll have to do something cleverer to work this out. i.e. Inside a Derived Column component extract all the different parts and concatenate them together to get what you are after.

e.g. "200" + SUBSTRING((DT_STR, 5, 1252)[columnname]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[columnname]),4,2)

Something like that anyway. I haven't test it.

-Jamie

Thanks for the help, I think half my battle is how I am trying to store it, what should the data type be in the derived column and also on the column of the table i am trying to store it in?

At the moment I am trying to store it as a date format should i be storing it as just a char or some thing

thanks again

So much for this being easy, you make it sound so!!

I put the following in the derived column expression field but it remains red, saying the function SUBSTRING requires 3 parameters, not 1, bla bla bla. I have no idea, can someone advise?

"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),4,2)

thanks

|||

Blackuke wrote:

So much for this being easy, you make it sound so!!

I put the following in the derived column expression field but it remains red, saying the function SUBSTRING requires 3 parameters, not 1, bla bla bla. I have no idea, can someone advise?

"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),4,2)

thanks

yeah my bad, sorry. It's just a matter of counting brackets though. Try this instead:

"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE],1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE],2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE],4,2)

rather than this which is what it was before (I've highlighted the offending characters in blue):

"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),4,2)

-Jamie

|||

Thanks guys, I have achieved my goal for today, which was to import the data into SQL 2005 as a date, thanks to your help.

I used a Derived Column with the following expression

"200" + SUBSTRING((DT_STR,5,1252)AHDATE,1,1) + "-" + SUBSTRING((DT_STR,5,1252)AHDATE,2,2) + "-" + SUBSTRING((DT_STR,5,1252)AHDATE,4,2)

I selected database timestamp as the data type and on the SQL 2005 table itself, on the date column I changed the data type to datetime

This was the only way I could get it to work.

Thanks guys for pointing me in the right direction, where shall i send the beer

(If you have a Exchange 2007 Question I am your man)!!!

|||

Jamie Thomson wrote:

Blackuke wrote:

So much for this being easy, you make it sound so!!

I put the following in the derived column expression field but it remains red, saying the function SUBSTRING requires 3 parameters, not 1, bla bla bla. I have no idea, can someone advise?

"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),4,2)

thanks

yeah my bad, sorry. It's just a matter of counting brackets though. Try this instead:

"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE],1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE],2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE],4,2)

rather than this which is what it was before (I've highlighted the offending characters in blue):

"200" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),1,1) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),2,2) + "-" + SUBSTRING((DT_STR, 5, 1252)[AHDATE]),4,2)

-Jamie

Thanks, i managed to google around to find it out, thanks for the help, no doubt you will see more of me!!!!

|||

Hi Jamie

It worked great until I put in the historical data

for data that was like 61103 (YMMDD) the expression worked great I got 2006/11/03

however when I got data from 2000 it was like this, 110 (M/DD) (eg 2006/01/10) this broke the import

is there a way to say that there are 6 digits, the first needs to be a 2 and all the missing digits need to be 0's

Please help

Thanks again

|||

Blackuke wrote:

Hi Jamie

It worked great until I put in the historical data

for data that was like 61103 (YMMDD) the expression worked great I got 2006/11/03

however when I got data from 2000 it was like this, 110 (M/DD) (eg 2006/01/10) this broke the import

is there a way to say that there are 6 digits, the first needs to be a 2 and all the missing digits need to be 0's

Please help

Thanks again

Yes.

I'll point you in the right direction but you're gonna have to piece this together yourself I'm afraid. I'm busy :)

The conditional operator will help you out:

http://msdn2.microsoft.com/en-us/sql/ms141680.aspx

-Jamie
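
For illustration, RIGHT can do the zero-padding in one go instead of a conditional per length. A sketch only, not tested against the real feed; it assumes every value belongs to the years 2000-2099 and left-pads AHDATE to six digits so that both 61103 and 110 end up as YYMMDD:

(DT_DBTIMESTAMP)("20" + SUBSTRING(RIGHT("000000" + (DT_STR,6,1252)AHDATE,6),1,2) + "-" + SUBSTRING(RIGHT("000000" + (DT_STR,6,1252)AHDATE,6),3,2) + "-" + SUBSTRING(RIGHT("000000" + (DT_STR,6,1252)AHDATE,6),5,2))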

|||

Thanks I appreciate it

|||

I could not get it working, so I decided to do 3 conditional splits and then the derived columns.

Seems to be working. Now I just have to work out how to do a time dimension; I have the invoice date in the fact table, so I think I have already made a mistake.

Thanks again

Data Conversion Issue

Hi,

I have a simple query that does the following :-

select desc1,desc2,desc3,desc4,desc5 from testdata

So I select the data for the above

What I would like to achieve, without having to go to great lengths, is the following:

So taking the column desc1 I want to insert this into a table

e.g.

Desc1 is record 1 in the table
Desc2 is record 2 in the table
Desc3 is record 3 in the table
Desc4 is record 4 in the table
and so on

Is there a tool or a special type declaration that can be used ?

Regards
Andrew
[posted and mailed, please reply in news]

Info (info@.schnof.co.uk) writes:
> I have a simple query that does the following :-
> select desc1,desc2,desc3,desc4,desc5 from testdata
>
> So I select the data for the above
> What I would like to achieve without having to go to great lengths the
> following:-
> So taking the column desc1 I want to insert this into a table
> e.g.
> Desc1 is record 1 in the table
> Desc2 is record 2 in the table
> Desc3 is record 3 in the table
> Desc4 is record 4 in the table
> and so on

I'm afraid that I don't really understand what you are asking for. Could
you provide the following:

o CREATE TABLE statement for your table(s).
o INSERT statement with sample data.
o The desired result from this sample data.

The reason I ask for this is that it clarifies your question, and with
the script it's possible to post a tested solution.

--
Erland Sommarskog, SQL Server MVP, sommar@.algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp
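
For what it's worth, one common way to turn the five desc columns into five rows is a UNION ALL insert. This is only a sketch, since the real table definitions were never posted, and DescRows is a hypothetical single-column target table:

INSERT INTO DescRows (description)
SELECT desc1 FROM testdata
UNION ALL
SELECT desc2 FROM testdata
UNION ALL
SELECT desc3 FROM testdata
UNION ALL
SELECT desc4 FROM testdata
UNION ALL
SELECT desc5 FROM testdata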

Data conversion inserting to DB2 on AS400 with SSIS

I created a SSIS package moving data from a SQL 2005 table to an existing DB2 table on AS400 using Microsoft OLE DB Provider for DB2.

When the package was run, it showed that rows were successfully inserted to DB2. However, the data didn't seem to be converted correctly. Most of the string values were inserted as unusual characters. Also any string values of digits were not inserted.

For example, 1.) a character field (char(1) or nchar(1) as I have tried both types) in the SQL 2005 table with a simple value of 'H' was inserted into the DB2 table field of type "A" (alphanumeric) of length 1 as '?', and other letters were inserted as other unusual characters. 2.) A string value of '00100' in SQL Server is not inserted to the DB2 table at all.

Later we found that the fields inserted with unusual characters are defined as CCSID=65535. A few fields with correct data inserted have CCSID=00037.

Does anyone know why this happened and how to solve this to get the data inserted correctly in the DB2 table?

Thanks in advance for any help!

Try to set the appropriate LocaleID and DefaultCodePage on your destination component.

Thanks.

|||

The LocaleID and DefaultCodePage were set correctly.

What I found was in the Data Link Properties of connection manager, the "Host CCSID" was set to "OEM - United States [437]". I changed it to "EBCDIC - US/Canada [37]" and it started working perfectly.

Thank you for your suggestion though.

Data Conversion in SSIS

what is the use of Data Conversion

please give me an example

Hi

BOL: "The Data Conversion transformation converts the data in an input column to a different data type and then copies it to a new output column. For example, a package can extract data from multiple sources, and then use this transformation to convert columns to the data type required by the destination data store. You can apply multiple conversions to a single input column. "

For example, I used it when the DataSource date column was treated as a string, but I needed it to be converted to datetime, or when the input column was an INT datatype, but I needed to populate a BIGINT destination column.

data conversion in during DTS import

Hi,
I have a text file in which one field is a date but in
file it is in '20030818' format and I am trying to import
that file to a table which has a column datetime.
is it possible to convert '20030818' to datetime during
DTS import?
if yes please let me know how ?
Thanks.|||use
1. vbscript transformation (activex)
2. select convert(datetime, '20030818')
3. dts datetime conversion facility
rohan
>--Original Message--
>Hi,
>I have a text file in which one field is a date but in
>file it is in '20030818' format and I am trying to import
>that file to a table which has a column datetime.
>is it possible to convert '20030818' to datetime during
>DTS import?
>if yes please let me know how ?
>Thanks.
>.
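
On option 2: the unseparated '20030818' (yyyymmdd) form converts the same way regardless of the server's DATEFORMAT or language settings, so the CONVERT can be used as-is in the transformation query. A quick sketch:

SELECT CONVERT(datetime, '20030818')        -- 2003-08-18 00:00:00.000
SELECT CONVERT(datetime, '20030818', 112)   -- style 112 (yyyymmdd) just makes the intent explicit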

Data Conversion failed due to Potential Loss of data

Hi,

I am getting this error when my ssis package is running

Data Conversion failed due to Potential Loss of data

the input column is in string format and output is in sql server bigint

the error is occurring when there is an empty string in the input. What should I do to overcome this?

It is an ID field. Should I convert it to bigint or should I leave it as a char datatype? Is that a good solution, or is there a way to overcome this?

Add a derived column to either change the empty string to NULL or a zero. Up to you, but you can't insert a string into an integer field.|||

I am not sure why a string is being passed into a BigInt but I would not leave an input field null. I would use the Conditional operator ? : to provide the empty string a value of 0 if it is empty using the following in an expression:

ISNULL(<<input field>>) ? 0 : <<input field>>

In other words the above states that if the incoming field is NULL then fill it with a 0 otherwise pass the incoming value.

|||

desibull wrote:

I am not sure why a string is being passed into a BigInt but I would not leave an input field null. I would use the Conditional operator ? : to provide the empty string a value of 0 if it is empty using the following in an expression:

ISNULL(<<input field>>) ? 0 : <<input field>>

In other words the above states that if the incoming field is NULL then fill it with a 0 otherwise pass the incoming value.

NULL and "empty string" are two very different things.

To expand on what I suggested earlier and desibull's code above:

ISNULL([InputColumn]) || [InputColumn] == "" ? 0 : [InputColumn]

OR

ISNULL([InputColumn]) || [InputColumn] == "" ? NULL(DT_I8) : [InputColumn]
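
If the data flow column itself needs to come out as DT_I8 rather than a string, the same idea with the cast folded in might look like this (a sketch only; [InputColumn] is the placeholder name used above, and blank or NULL input becomes a database NULL):

ISNULL([InputColumn]) || LTRIM(RTRIM([InputColumn])) == "" ? NULL(DT_I8) : (DT_I8)[InputColumn]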

Sunday, March 11, 2012

Data Conversion Errors on Excel Import into existing table

Recently installed Sql Server 2005 client and am now attempting to import data from a spreadsheet into an existing table. This works fine with Sql Server 2000 but I am getting data conversion truncation errors that stop the process when this runs using import utility in Sql Server 2005.

Any help would be appreciated.

More information needed.

Why is it failing? What does the error message say?

You will need to open up the package and edit it to do the conversions that you require.

-Jamie

Data Conversion Error on Excel Destination

I am inserting rows using OLEDBDestination and want to redirect all error rows to EXCEL Destination.

I have used Data Conversion Transformation to Convert all strings to Unicode string fields before sending it to Excel Destination.

But its gives the following error.

[Data Conversion [16]] Error: Data conversion failed while converting column 'A' (53) to column "Copy of A" (95). The conversion returned status value 8 and status text "DBSTATUS_UNAVAILABLE".

[Data Conversion [16]] Error: The "output column "Copy of A" (95)" failed because error code 0xC020908E occurred, and the error row disposition on "output column "Copy of A" (95)" specifies failure on error. An error occurred on the specified object of the specified component.

Can someone please tell me what should I do to make it work?

Thanks,

Did you set the error output of the OLE Destination to "Redirect Rows"?

You might also have metadata problems and could maybe stand to recreate the Excel destination.|||

I have set the error output to redirect rows.

Actually I have a Data flow task with Oledbsource --> Oledbdestination (Redirect Error Rows) --> Data Conversion --> Excel Destination.

When I do this its gives me the following error.

Data conversion failed while converting column "A" (53) to column "A" (95). The conversion returned status value 8 and status text "DBSTATUS_UNAVAILABLE".

[Data Conversion [16]] Error: The "output column "Copy of A" (95)" failed because error code 0xC020908E occurred, and the error row disposition on "output column "Copy of A" (95)" specifies failure on error. An error occurred on the specified object of the specified component.

But this data flow task, Oledbsource --> Data Conversion --> Excel Destination, works fine.

I am doing the same data conversion in both data flow tasks.

Please let me know what I am doing wrong.

|||Well, I suppose check and double check your column mappings going into and out of the OLE DB destination.|||

prg wrote:

I am inserting rows using OLEDBDestination and want to redirect all error rows to EXCEL Destination.

I have used Data Conversion Transformation to Convert all strings to Unicode string fields before sending it to Excel Destination.

But its gives the following error.

[Data Conversion [16]] Error: Data conversion failed while converting column 'A' (53) to column "Copy of A" (95). The conversion returned status value 8 and status text "DBSTATUS_UNAVAILABLE".

[Data Conversion [16]] Error: The "output column "Copy of A" (95)" failed because error code 0xC020908E occurred, and the error row disposition on "output column "Copy of A" (95)" specifies failure on error. An error occurred on the specified object of the specified component.

Can someone please tell me what should I do to make it work?

Thanks,

Actually I would check the Data Conversion transform, specifically the column A, since that is where the error is being generated. Try placing a data viewer to inspect the values that go into the data conversion. What happens if you delete column A from the data conversion? Does it fail in a different column?

|||

I have checked the Data conversion transform again. I have removed column A and used another column. Still it fails.

This is kind of weird. I can't figure out where the problem is.

Is it that the Excel Destination cannot be used to redirect rows that have errors? Because it seems to work perfectly when I do OLEDBSource -> Data Conversion -> Excel Destination.

|||Well, if the errors are created (redirected) because of conversion errors, then it is natural that the redirected error rows don't match your destination data types. They can't because that's why they are in error.

Make sure that the data types of the Excel destination match the data types of your SOURCE data, not the data types of the OLE DB Destination table. Not sure if this will work because the metadata probably won't match...|||

Hi all,

I think I have exactly the same problem using a sql destination for error output.

Error: 0xC02020C5 at GS_ORDREC, Data Conversion [1827]: Data conversion failed while converting column "OL_DONORD" (102) to column "Test" (1880). The conversion returned status value 8 and status text "DBSTATUS_UNAVAILABLE".

Error: 0xC0209029 at GS_ORDREC, Data Conversion [1827]: The "output column "Test" (1880)" failed because error code 0xC020908E occurred, and the error row disposition on "output column "Test" (1880)" specifies failure on error. An error occurred on the specified object of the specified component.

Error: 0xC0047022 at GS_ORDREC, DTS.Pipeline: The ProcessInput method on component "Data Conversion" (1827) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

SSIS package "GS_ODSREC.dtsx" finished: Failure.

-

I check and recheck the package, everything is ok. Strange thing, when I try to convert the "ErrorColumn" field, the package works fine.

Regards

Ayzan

|||

Ayzan wrote:

Hi all,

I think I have exactly the same problem using a sql destination for error output.

Error: 0xC02020C5 at GS_ORDREC, Data Conversion [1827]: Data conversion failed while converting column "OL_DONORD" (102) to column "Test" (1880). The conversion returned status value 8 and status text "DBSTATUS_UNAVAILABLE".

Error: 0xC0209029 at GS_ORDREC, Data Conversion [1827]: The "output column "Test" (1880)" failed because error code 0xC020908E occurred, and the error row disposition on "output column "Test" (1880)" specifies failure on error. An error occurred on the specified object of the specified component.

Error: 0xC0047022 at GS_ORDREC, DTS.Pipeline: The ProcessInput method on component "Data Conversion" (1827) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

SSIS package "GS_ODSREC.dtsx" finished: Failure.

-

I check and recheck the package, everything is ok. Strange thing, when I try to convert the "ErrorColumn" field, the package works fine.

Regards

Ayzan

Okay, you *don't* have the same problem as the OP. If you are using a SQL Server destination, SSIS will not automatically cast/convert datatypes for you. The data flow datatypes (metadata) will need to match EXACTLY with the SQL Server destination table.|||

Thank you Phil Brammer,

Sorry this IS the same problem, cause I try with a Sql server destination, a data reader destination, a flat file destination, for redirecting the failed rows, and I have still the same message ... in spite of casting/converting datatypes

Regards

Ayzan

|||

Ayzan wrote:

Thank you Phil Brammer,

Sorry this IS the same problem, cause I try with a Sql server destination, a data reader destination, a flat file destination, for redirecting the failed rows, and I have still the same message ... in spite of casting/converting datatypes

Regards

Ayzan

Ah, see, you left out valuable information! So I'll ask this as I have done before. Before we continue, please provide the metadata information as it stands before going into the destination. Then, please provide the table column information for the destination as well.

One other thing to note is that there could be one bad row that is causing this error... Have you redirected errors and inspected them?|||

Ok,

There is a simple Data Flow, so as to reproduce the error:

#Data Source, OLEDB, AdventureWorksDW

#OLE DB Source, SQL Command :

SELECT TOP (10) TimeKey, OrganizationKey, DepartmentGroupKey, ScenarioKey, AccountKey+100 as AccountKey, Amount
FROM FactFinance

#OLE DB Destination

AdventureWorks.FactFinance

#Error Output to Data Conversion

TimeKey -> DT_STR

#DataReader Destination

Enjoy !

|||

Ayzan wrote:

Ok,

There is a simple Data Flow, so as to reproduce the error:

#Data Source, OLEDB, AdventureWorksDW

#OLE DB Source, SQL Command :

SELECT TOP (10) TimeKey, OrganizationKey, DepartmentGroupKey, ScenarioKey, AccountKey+100 as AccountKey, Amount
FROM FactFinance

#OLE DB Destination

AdventureWorks.FactFinance

#Error Output to Data Conversion

TimeKey -> DT_STR

#DataReader Destination

Enjoy !

I don't have an AdventureWorks.FactFinance table. I have the data warehouse table... Are you creating your own FactFinance table in the AdventureWorks database? The mere prefix of "Fact" indicates that it should be in AdventureWorksDW.|||

Sorry,

#OLE DB Destination

AdventureWorksDW.FactFinance

Well in fact, whatever the source/destination, you just have to generate an error while trying to insert new records.

Regards

Ayzan

|||

Ayzan wrote:

Sorry,

#OLE DB Destination

AdventureWorksDW.FactFinance

Well in fact, whatever the source/destination, you just have to generate an error while trying to insert new records.

Regards

Ayzan

I'm utterly confused now. So your source and destination tables are the same? What are you doing exactly, and what are you expecting to see? Why would I select the top 10 records from FactFinance and then turn around and insert them again? Regardless, nowhere should a DT_STR datatype be picked up on the Timekey field because it is an integer field.

Nevermind. I understand now. Hang on.

Data conversion error

I'm importing data from a text table, into a temp table, and then on to a final working table. I'm running into difficulty converting data from text into a 'datetime' format. I'm getting the following error:

"Arithmetic overflow error" when trying to convert into a column with the data type "DateTime"

I half expected it to reject all conversions into a Date format because of the source file being in text format, but it allows this conversion in other columns.

If I switch the Data type to 'nvarchar' instead of 'datetime' it converts and pops up with a date format.

My question is: Will this nvarchar that looks like a date behave like a date? For example, if someone tries to perform a calculation with a date format, will the nvarchar suffice, or would it cause problems?

Any ideas?|||It won't convert what's causing this error. You need to do something like this:

SELECT CASE WHEN ISDATE(text_field) = 0 THEN '01/01/01' ELSE CAST(text_field AS DATETIME) END
FROM table
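
On the original question of whether an nvarchar that merely looks like a date behaves like one: it will sort and compare as text until it is explicitly converted, so calculations need a CAST/CONVERT. A small illustration with hypothetical values:

DECLARE @a nvarchar(10), @b nvarchar(10)
SET @a = N'02/01/2004'   -- the later date
SET @b = N'12/31/2003'   -- the earlier date
SELECT CASE WHEN @a > @b THEN 'later' ELSE 'earlier' END
    -- returns 'earlier': '0' sorts before '1', so the strings compare as text
SELECT CASE WHEN CONVERT(datetime, @a, 101) > CONVERT(datetime, @b, 101) THEN 'later' ELSE 'earlier' END
    -- returns 'later': style 101 (mm/dd/yyyy) converts both values, so they compare as dates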

Data conversion error

Hi all

Thanx for all your enthusiastic participation yesterday. I've got a new question on data conversion related to the same problem i asked about yesterday.

I have a SQL query as follows which generates the following error when it is executed against the database using sql query analyser and also through an ASP.NET application.

Error (Query Analyser):
The conversion of a char data type to a datetime data type resulted in an out-of-range datetime value.

So it's basically this line
(Convert(datetime, Request.closeDate, 103) > Convert (datetime, '8/25/2003', 103))
that is causing the problem

SELECT Request.requestID,
Users.clientCode,
Job.jobID,
Job.allocatedTime,
Job.spentTime,
Request.state,
Job.deadline

FROM Request,
Users,
Job,
Staff

WHERE (
Users.userID = Request.userID AND
Job.staff = Staff.staffID AND
Job.request = Request.requestID AND
(
(Job.staffProgress != 1 AND Request.state = 'In Progress') OR
(Convert(datetime, Request.closeDate, 103) > Convert (datetime, '8/25/2003', 103))
) AND
Request.state = 43
)
ORDER BY
Job.deadline,
Job.allocatedTime,
Request.priority

The weird thing is that this query works fine on our live server but fails when I try to execute it on the local machine. I think the live server is running SQL Server Service Pack 3; for the one on my local machine, I couldn't find out for some reason.

Any suggestions would be greatly appreciated

Cheers

James|||the same piece of sql query above also generated the following error.

Syntax error converting the varchar value 'In Progress' to a column of data type int.

The datatype of Request.state is varchar(20).|||I should mention all the associated data types in the WHERE block.

Job.staff (decimal)
Staff.staffID (decimal)
Job.request (decimal)
Request.requestID (decimal)

Job.staffProgress (float)
Request.state (varchar(20))
Request.closeDate (datetime)

Please help

Thanx

James|||Originally posted by nano_electronix
The datatype of Request.state is varchar(20).

Look again at your data type of Request.State! In your WHERE clause, you are comparing Request.State both with 'In Progress' and with 43. One of these is wrong, and your error message is clearly stating that the data type of Request.State is INT!
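
One extra observation, an assumption on my part rather than something from the thread: style 103 means dd/mm/yyyy, so Convert(datetime, '8/25/2003', 103) reads 25 as the month and fails with exactly this out-of-range message. An unambiguous yyyymmdd literal avoids the problem:

SELECT CONVERT(datetime, '8/25/2003', 103)   -- fails: with style 103 the 25 is taken as a month
SELECT CONVERT(datetime, '20030825', 112)    -- succeeds regardless of date format settings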

Data Conversion Error

Hi,

I am transferring data from a flat file source to a SQL database. Initially I got a validation error saying

"Validation error. Import Employee Information: [DataFlowTask] [6384]: The column "[ColumnName] " can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported." ( i have 6 date fields)

The destination in SQL has the field in "datetime" format.


So I used a "Data Conversion" task to transform the fields, thereby resolving the validation errors. But when I execute the dtsx I get the following error:

[Data Conversion [322]] Error: Data conversion failed while converting column "[ColumnName]" (79) to column "[Modified_ColumnName]" (346). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".

[Data Conversion [322]] Error: The "output column "[Modified_ColumnName]" (346)" failed because error code 0xC020907F occurred, and the error row disposition on "output column "[Modified_ColumnName]" (346)" specifies failure on error. An error occurred on the specified object of the specified component.

[DTS.Pipeline] Error: The ProcessInput method on component "Data Conversion" (322) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

[DTS.Pipeline] Error: Thread "WorkThread0" has exited with error code 0xC0209029.

["InputFileName" Input [1]] Error: Setting the end of rowset for the buffer failed with error code 0xC0047020.

[DTS.Pipeline] Error: The PrimeOutput method on component "InputFileName Input" (1) returned error code 0xC0209017. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

[DTS.Pipeline] Error: Thread "SourceThread0" has exited with error code 0xC0047038.

Can anybody help me with this?

The "The value could not be converted because of a potential loss of data" error generally occurs when you're trying to convert a wider data type value into a data type too narrow to hold it. Can you look at the metadata in your data flow to see what the exact data types are for the source ("[ColumnName]") and target ("[Modified_ColumnName]") columns?|||Can you post some of your sample data in the [ColumnName] column?|||

Hi,

I am transferring data from .dat file to tables in SQL DB. The output from .dat file is in [DT_STR] datatype.

[ColumnName1] [ColumnName2]

04/27/1987 0

05/20/1958 0

03/12/1982 1

05/04/1957 2

Error 23 Validation error. [TaskName]: [TableName] [9]: The column "[ColumnName1]" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

The table in the DB has "datetime" as its datatype

Error 26 Validation error. [TaskName]: [TableName] [9]: The column "[ColumnName2]" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

The table in the DB has "int" as its datatype

So i used the "data conversion" task in SSIS and resolved the validation error.

But when i executed the dtsx i got the error mentioned in my previous post

|||I'd add a derived column transformation between the source and the destination.

In that derived column, you'd cast the values appropriately.

Expressions below:
NewOutputColumnName1: (DT_DBTIMESTAMP)[ColumnName1]
NewOutputColumnName2: (DT_I4)[ColumnName2]
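
One caveat on the date columns: if the straight (DT_DBTIMESTAMP) cast complains about the mm/dd/yyyy layout of values like 04/27/1987, a derived column expression that rearranges the parts into YYYY-MM-DD first is a safer bet. A sketch, assuming the column always arrives as a ten-character mm/dd/yyyy string:

NewOutputColumnName1: (DT_DBTIMESTAMP)(SUBSTRING([ColumnName1],7,4) + "-" + SUBSTRING([ColumnName1],1,2) + "-" + SUBSTRING([ColumnName1],4,2))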

Data Conversion Components & Code Page issue

I am using the SSIS wizard to pull data from DB2 z/OS to SQL Server. The data flow task that is created converts the data to DT_STR ANSI 1252 before storing it to the SQL Server database. The package is blowing up in the data conversion component...no match found in the target code page...for my city name field.

My old dts wizard didn't have this problem. The forums seem to indicate that SSIS is no longer doing some of the implicit conversions that DTS did and I may have to do more than one conversion.

What format type/code page do I use for the other conversion? The code page for my DB2 data source is 37.

I've tried several scenarios and none of them have worked. Any hints?

Quick and dirty workaround was to change the destination column to nchar.|||

This one is a good workaround if you don't mind your strings being Unicode at the destination.

If you would rather keep your ANSI strings, then you should go to the package set the appropriate code page in the data conversion transform and set the appropriate collation in the CREATE TABLE statement.

Thanks.
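
For the ANSI route, a sketch of what "the appropriate collation in the CREATE TABLE statement" can look like; the table and column names are hypothetical, and SQL_Latin1_General_CP1_CI_AS stores varchar data in code page 1252:

CREATE TABLE dbo.CityStage
(
    city_name varchar(50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
)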

|||

I tried that and couldn't get it to work.

My problem isn't a foreign language issue, but a special character issue. My old DTS packages accepted the special characters without a fuss, but I'm told SSIS isn't doing the implicit conversion that the old DTS used to do.

Obviously, I can't seem to figure out which code page to use. And where do I set all of these code pages? My input data is coming from DB2 z/OS which is using code page 37. Do I set the code page of the source component to 37? The destination? The transform component? I tried setting the code page with multiple scenarios...none worked.

There is no where that I can find that explains how to do a code page transformation except in the most general terms.

PS. Wouldn't I only change the collation for a foreign language?

Data Conversion and Derived Column issue

Hi all of you,

I think that I've made a big mess of my work... I've got a plain file which must be loaded into a SQL table. Up to there no problem; I use Derived Column because columns need to be transformed with NULL, RIGHT, LEN, and so on...

But in this last package I've done half of the work using Data Conversion, and I've got five columns which would need to be transformed, but I don't know how I can do such a thing. I can't connect a Derived Column from the Flat File Source task, of course; it's already going to the Data Conversion...

Let me know if you need further details.

Previously I think, silly idea, that it could be unified or better, that Data Conversion task allows me make transformations...

Thanks a lot for your ideas and thoughs,

Can't you just connect the output from the Data Conversion transform to the Derived Column transform, then connect it to your destination component?

|||

Yes, you're totally right. Everything's fine.

thanks

Data Conversion (String to DateTime)

I am trying to insert data from a web form to a SQL Database. I am receiving the following error: {"String was not recognized as a valid Boolean."} I am also receiving a similar error for text boxes that have dates.

Below is the code that I am using:

<asp:SqlDataSource
    id="SqlDataSource1"
    runat="server"
    connectionstring="<%$ ConnectionStrings:ConnMktProjReq %>"
    selectcommand="SELECT LoanRepName, Branch, CurrentDate, ReqDueDate, ProofByEmail, ProofByFax, FaxNumber, ProjectExplanation, PrintQuantity, PDFDisc, PDFEmail, LoanRepEmail FROM MktProjReq"
    insertcommand="INSERT INTO MktProjReq (LoanRepName, Branch, CurrentDate, ReqDueDate, ProofByEmail, ProofByFax, FaxNumber, ProjectExplanation, PrintQuantity, PDFDisc, PDFEmail, LoanRepEmail) VALUES (@RepName, @BranchName, @Date, @DueDate, @ByEmail, @ByFax, @Fax, @ProjExp, @PrintQty, @Disc, @Email, @RepEmail)">

    <InsertParameters>
        <asp:FormParameter Name="RepName" FormField="LoanRepNameBox" />
        <asp:FormParameter Name="BranchName" FormField="BranchList" />
        <asp:FormParameter Name="Date" FormField="CurrentDateBox" Type="DateTime" />
        <asp:FormParameter Name="DueDate" FormField="ReqDueDateBox" Type="DateTime" />
        <asp:FormParameter Name="ByEmail" FormField="ProofByEmailCheckbox" Type="boolean" />
        <asp:FormParameter Name="ByFax" FormField="ProofByFaxCheckbox" Type="boolean" />
        <asp:FormParameter Name="Fax" FormField="FaxNumberBox" />
        <asp:FormParameter Name="ProjExp" FormField="ProjectExplanationBox" />
        <asp:FormParameter Name="PrintQty" FormField="PrintQuantityBox" />
        <asp:FormParameter Name="Disc" FormField="PDFByDiscCheckbox" Type="boolean" />
        <asp:FormParameter Name="Email" FormField="PDFByFaxCheckbox" Type="boolean" />
        <asp:FormParameter Name="RepEmail" FormField="LoanRepEmailBox" />
    </InsertParameters>

</asp:SqlDataSource>

protected void Button1_Click(object sender, EventArgs e)
{
    SqlDataSource1.Insert();
}

I have been searching the forums for ways to parse the data, but I haven't found anything that works. Can anyone provide guidance?

Thank you,

Paul

What about removing the Type="boolean" attribute from these form fields, so they become:

<asp:FormParameter Name="ByEmail" FormField="ProofByEmailCheckbox" />

<asp:FormParameter Name="ByFax" FormField="ProofByFaxCheckbox" />

<asp:FormParameter Name="Disc" FormField="PDFByDiscCheckbox" />

<asp:FormParameter Name="Email" FormField="PDFByFaxCheckbox" />

|||

Thank you, that ended the exception errors.

Paul

data conversion -- varchar to nvarchar

Hello,
The following question applies to SQL Server 2000 SP3 and SQL Server 2005:
We are in the process of investigating international support for our
application and part of that will likely require changing all of the
char/varchar columns in our database to nchar/nvarchar.
From what I gather by reading these forums, altering the columns in place
via ALTER TABLE statements is an acceptable method for doing this.
The question that I have been unable to completely confirm, however, is what
happens to existing data in the tables? Does the data get converted to
Unicode as part of the ALTER TABLE statement? And, if so, is there any risk
that this conversion will produce unexpected or unpredictable results?
Preliminary testing seems to indicate that the data does get converted
successfully during this process but I am just trying to confirm that
suspicion.
Thanks in advance for any assistance,
JohnLx|||> From what I gather by reading these forums, altering the columns in place
> via ALTER TABLE statements is an acceptable method for doing this.
> The question that I have been unable to completely confirm, however, is
> what
> happens to existing data in the tables? Does the data get converted to
> Unicode as part of the ALTER TABLE statement?
Yes, but of course there can't already be any data in there that requires
Unicode. Other than disk space requirements for those columns doubling (and
don't forget indexes), you shouldn't see any noticeable difference, except
(see next comment).

> And, if so, is there any risk
> that this conversion will produce unexpected or unpredictable results?
Absolutely. For char/varchar in the smaller size ranges (up to 4000) you
shouldn't see any problems. However, if you have a varchar(8000) with at
least one tuple with 4001 or more characters, you will get the following
when you try to convert VARCHAR(8000) to NVARCHAR(4000) (the largest size
for varying double-wide):
Msg 8152, Level 16, State 4, Line 1
String or binary data would be truncated.
The statement has been terminated.
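A quick pre-check along these lines (table and column names are placeholders) will flag anything that would be truncated before you run the ALTER:

Code Snippet

-- Count rows that will not fit in nvarchar(4000) before converting.
SELECT COUNT(*) AS TooLongRows
FROM dbo.MyTable
WHERE LEN(MyVarcharColumn) > 4000;

-- If the count is zero, the in-place conversion is safe
-- (keep the column's original NULL/NOT NULL setting).
ALTER TABLE dbo.MyTable
    ALTER COLUMN MyVarcharColumn nvarchar(4000) NULL;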
In addition, you'll want to reindex and update statistics for any tables
that have indexes/statistics on the column(s) you're changing. You'll also
want to verify that you re-compile (and alter params and converts, where
necessary) any stored procedures or functions that reference the columns.
You might also want to issue sp_refreshview for any views that reference
those tables.
A|||John,
the data gets converted in place as long as it fits into the new
data type. In your case there shouldn't be a problem. The only thing
you have to keep in mind is that while you can store up to 8000 characters
in a varchar column, nvarchar uses twice the space and thus the limit is
4000 characters.
Markus|||Whenever you alter the structure of a table, it can cause index
fragmentation. You will want to use DBCC INDEXDEFRAG for each altered table.
http://www.microsoft.com/technet/pr...n/ss2kidbp.mspx
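A rough post-conversion maintenance pass (all object names below are placeholders) might look like:

Code Snippet

-- Defragment the altered table's index (SQL Server 2000/2005 syntax).
DBCC INDEXDEFRAG ('MyDatabase', 'dbo.MyTable', 'IX_MyTable_MyVarcharColumn');

-- Refresh statistics and any dependent views after the column change.
UPDATE STATISTICS dbo.MyTable;
EXEC sp_refreshview 'dbo.vMyView';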
"johnlx" <nomail@.discussions.microsoft.com> wrote in message
news:2288C9DC-40B8-48E0-A63B-143A622A7030@.microsoft.com...
> Hello,
> The following question applies to SQL Server 2000 SP3 and SQL Server 2005:
> We are in the process of investigating international support for our
> application and part of that will likely require changing all of the
> char/varchar columns in our database to nchar/nvarchar.
> From what I gather by reading these forums, altering the columns in place
> via ALTER TABLE statements is an acceptable method for doing this.
> The question that I have been unable to completely confirm, however, is
> what
> happens to existing data in the tables? Does the data get converted to
> Unicode as part of the ALTER TABLE statement? And, if so, is there any
> risk
> that this conversion will produce unexpected or unpredictable results?
> Preliminary testing seems to indicate that the data does get converted
> successfully during this process but I am just trying to confirm that
> suspicion.
> Thanks in advance for any assistance,
> JohnLx
|||Thanks to everyone who responded. You confirmed what I thought with regard
to the data conversion.
Aaron, good point about the views. I hadn't thought to rebuild them but
will include that in the process.
Thanks,
--John Lennox|||On Mon, 19 Dec 2005 14:23:08 -0500, Aaron Bertrand [SQL Server MVP]
wrote:
(snip)
Hi Aaron (and JohnLx),
And to avoid implicit conversions that might damage performance, the
next step would be to check all variable declarations, temp table
definitions and table variable definitions - they too should be changed
from varchar to nvarchar and from char to nchar.
And finally, put an N in front of all string constants in your code.
(I.e., use SET @var = N'Text', not SET @var = 'Text')
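A minimal before/after sketch of that kind of change (the variable name is just an example):

Code Snippet

-- Before: varchar variable and unprefixed literal (implicit conversion risk)
DECLARE @CustomerName varchar(100);
SET @CustomerName = 'Text';
GO

-- After: nvarchar variable and N-prefixed Unicode literal
DECLARE @CustomerName nvarchar(100);
SET @CustomerName = N'Text';
GO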
Best, Hugo

data conversion - numeric to string losing precision

Hi All,

i'm using a "data conversion" object to convert a numeric field to a string just before i save the record set to the database.

the problem is when this numeric field is > 0 it looses the precision on its decimal value.

example, if numeric value is 0.32

after converting this to a string, the new value will be : .32

it's lost the 0 infront of it. i can't do this converion in the query level because it's a derived field, so i need to convert it to a string before stroing it.

when converting to string i'm using the code page 1252 (ANSI - Latin I). i also tried with unicode string both looses this 0 infront.

can any one help on this?

Is your ultimate database target for the numeric data type a character based column, hence the need to retain the leading zeros?

If that is the case, retaining leading zeros can be accomplished with a derived column transform, rather than a data conversion transform. There you have access to the SSIS expression language, with a relatively standard, if meager, set of string functions, as well as type casts.

Another alternative for pretty much any data type conversion is a Script Transform, which will afford you the full power of the .NET Framework, including in this case custom numeric formats for types destined for strings.|||

Hi jaegd,

I tried that, but no luck. I used a Derived Column and this is my expression:

ISNULL(investmentPercentage) ? 0.00 : investmentPercentage

Here investmentPercentage is a numeric(15,2). I add the above expression as a new column (string, 8 chars), but it still loses the 0:

.00 when it's null

.32 when it's 0.32

Any comments?

AJ

|||

Use the Script Task, and follow the instructions in this link for numeric format specifiers for strings. You need to do something similar to this...

Dim number As Double

number.ToString("F2")    ' fixed-point format keeps the leading zero, e.g. 0.32 becomes "0.32"

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/cpguide/html/cpconstandardnumericformatstrings.asp

|||

In the derived column, you can still achieve it with conditional logic. The script approach mentioned above is cleaner, but if you want to avoid adding that, you can try an expression like:

(investmentPercentage < 1 ? "0" : "") + (DT_WSTR, 20)investmentPercentage

This expression will add a "0" to the beginning of the string if the number is less than one (which is the only time the zero is dropped). When the number is greater than or equal to 1, an empty string is added.

NOTE: this does not take into account negative values. If investmentPercentage can be negative, you will need to tweak the expression a little (or post here, and I can help with that).

Thanks
Mark

|||

Hi Mark,

Thanks for that, but it sort of half answers my question.

Yes, investmentPercentage can be negative or can even be null. I was wondering if there's an easier way. I've got a lot of columns like this; writing an expression this long for each of them can be tedious. Is this how you would "tweak" it?

ISNULL(investmentPercentage) ? "0.00" : (investmentPercentage < 1 && investmentPercentage >-1 ? "0" : "") + (DT_WSTR, 8)investmentPercentage

|||

You might find it easier to use string formatting in the script component... I can't think of any other way that would be better.

As for tweaking the expression, you will have to do some extra things to take care of the negative sign (so the prepended zero doesn't end up before the negative sign). The following expression adds a case for values between -1 and 0 to handle that:

ISNULL(investmentPercentage) ? (DT_WSTR,8)"0.00" : investmentPercentage < 0 && investmentPercentage > -1 ? "-0" + SUBSTRING((DT_WSTR,8)investmentPercentage, 2, 8) : (investmentPercentage >= 0 && investmentPercentage < 1 ? "0" : "") + (DT_WSTR,8)investmentPercentage

Let me know if this does the trick for you.

Thanks
Mark

Data Conversion - NULL datetime causes failure

Hi,

I have a flat file source and am bringing it into an OLE DB Destination connected to a SQL Server 2005 database. Since I am bringing in the data via a flat file, I have a conversion step in the data flow, which uses a "Data Conversion" transformation to convert each column to the appropriate destination data type.

When I run my SSIS package, if the datetime column in the flat file is NULL, then the Data Conversion step fails. I have my Data Conversion step set to redirect error rows to an error output. So, I get all the rows with populated datetime values in the intended destination, and none of the rows with null datetime values (these go to the error destination).

This is a problem! I need to make sure all valid rows fall through. It seems very odd to me that a NULL datetime could not be converted to a datetime data type. This makes no sense, since a datetime column can naturally hold NULL, just like any other type. So, my question is: is this a bug? It seems to me it is. I can't believe this would be the desired behavior?!

Microsoft: please let us know whether this is expected behavior. If so, then what would the desired workaround be?

Thanks

Do you have the retain nulls option checked in the Flat File Source?

If you are treating the column as a string in your flat file source, you may need to use a Derived Column transform to handle the conversion. Use an expression like:

Code Snippet

([YourDateColumn]=="")|| (ISNULL([YourDateColumn])) ? NULL(DT_DBTIMESTAMP) : (DT_DBTIMESTAMP)[YourDateColumn]
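For what it's worth, if the raw strings were instead landed in a staging table and converted on the SQL Server side, the equivalent guard in T-SQL would look roughly like this (the staging table name is made up):

Code Snippet

-- Convert blank or NULL strings to a NULL datetime, otherwise cast the value.
SELECT CASE
           WHEN YourDateColumn IS NULL OR YourDateColumn = '' THEN NULL
           ELSE CONVERT(datetime, YourDateColumn, 101)   -- 101 = mm/dd/yyyy
       END AS YourDateValue
FROM dbo.Staging_FlatFile;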

|||Thanks much - the checkbox in the flat file source did the trick. Never noticed that before, but it worked great. Thanks again.