
Sunday, March 11, 2012

Data Conversion Error

Hi,

I am transferring data from a flat file source to a SQL database. Initially I got a validation error saying:

"Validation error. Import Employee Information: [DataFlowTask] [6384]: The column "[ColumnName]" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported." (I have 6 date fields.)

The destination column in SQL Server has the "datetime" data type.


So I used a "Data Conversion" transformation to convert the fields, which resolved the validation errors. But when I execute the dtsx I get the following errors:

[Data Conversion [322]] Error: Data conversion failed while converting column "[ColumnName]" (79) to column "[Modified_ColumnName]" (346). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".

[Data Conversion [322]] Error: The "output column "[Modified_ColumnName]" (346)" failed because error code 0xC020907F occurred, and the error row disposition on "output column "[Modified_ColumnName]" (346)" specifies failure on error. An error occurred on the specified object of the specified component.

[DTS.Pipeline] Error: The ProcessInput method on component "Data Conversion" (322) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

[DTS.Pipeline] Error: Thread "WorkThread0" has exited with error code 0xC0209029.

["InputFileName" Input [1]] Error: Setting the end of rowset for the buffer failed with error code 0xC0047020.

[DTS.Pipeline] Error: The PrimeOutput method on component "InputFileName Input" (1) returned error code 0xC0209017. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

[DTS.Pipeline] Error: Thread "SourceThread0" has exited with error code 0xC0047038.

Can anybody help me with this?

The "The value could not be converted because of a potential loss of data" error generally occurs when you're trying to convert a wider data type value into a data type too narrow to hold it. Can you look at the metadata in your data flow to see what the exact data types are for the source ("[ColumnName]") and target ("[Modified_ColumnName]") columns?

|||Can you post some of your sample data in the [ColumnName] column?

|||

Hi,

I am transferring data from a .dat file to tables in a SQL database. The output from the .dat file is of the [DT_STR] data type.

[ColumnName1] [ColumnName2]

04/27/1987 0

05/20/1958 0

03/12/1982 1

05/04/1957 2

Error 23 Validation error. [TaskName]: [TableName] [9]: The column "[ColumnName1]" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

The column in the DB table has "datetime" as its data type.

Error 26 Validation error. [TaskName]: [TableName] [9]: The column "[ColumnName2]" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

The column in the DB table has "int" as its data type.

So I used the "Data Conversion" transformation in SSIS and resolved the validation errors.

But when I executed the dtsx I got the errors mentioned in my previous post.

|||I'd add a derived column transformation between the source and the destination.

In that derived column, you'd cast the values appropriately.

Expressions below:
NewOutputColumnName1: (DT_DBTIMESTAMP)[ColumnName1]
NewOutputColumnName2: (DT_I4)[ColumnName2]
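One thing to watch: the (DT_DBTIMESTAMP) cast will still fail on any row whose string doesn't parse as a date, which is usually what's behind the "potential loss of data" error. A quick way to find the offending rows is to pre-validate the file outside SSIS; here's a minimal Python sketch (the MM/DD/YYYY format and the two-column layout are assumptions based on the sample data above):

```python
from datetime import datetime

def validate_rows(rows, date_format="%m/%d/%Y"):
    """Split rows into convertible and problem rows, mimicking
    what the SSIS casts (DT_DBTIMESTAMP / DT_I4) will accept."""
    good, bad = [], []
    for date_str, int_str in rows:
        try:
            datetime.strptime(date_str.strip(), date_format)
            int(int_str.strip())
            good.append((date_str, int_str))
        except ValueError:
            bad.append((date_str, int_str))
    return good, bad

rows = [("04/27/1987", "0"),
        ("05/20/1958", "0"),
        ("13/45/1982", "x")]  # a row like this would fail the SSIS cast
good, bad = validate_rows(rows)
```

Running something like this over the .dat file will tell you whether the failure is a bad row in the data or a format mismatch on every row.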

Saturday, February 25, 2012

Data and index pages

I found the following in a book, regarding index pages:
'Unlike data pages, there is no limit to the total number of entries that
can be made on an index page.'
What does that mean?
It's wrong. An index page has the exact same limit as a data page. There's
8k in a page minus the header size and the slot array, and that's all the
space that's available on the page for records (entries). There's a finite
limit defined by the minimum possible record size plus the slot-array
overhead for each record - I'd work it out for you exactly but I can't
remember the minimum possible record size off the top of my head. The max is
somewhere around 310-320 records per page, though.
Regards
Paul Randal
Dev Lead, Microsoft SQL Server Storage Engine
This posting is provided "AS IS" with no warranties, and confers no rights.
"Alan" <NOSPAMalan_pltse@.yahoo.com.au> wrote in message
news:uXr6ainqEHA.1644@.tk2msftngp13.phx.gbl...
> I found the following in a book, regarding index pages:
> 'Unlike data pages, there is no limit to the total number of entries that
> can be made on an index page.'
> What does that mean?
>
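Paul's ballpark can be sanity-checked with some quick arithmetic; a minimal sketch, assuming an 8192-byte page, a 96-byte page header, and a 2-byte slot-array entry per record (the roughly 24-byte minimum record size below is my assumption, chosen only to illustrate how a 310-320 range falls out):

```python
PAGE_SIZE = 8192    # total bytes in a SQL Server page
HEADER_SIZE = 96    # bytes consumed by the page header
SLOT_ENTRY = 2      # slot-array bytes per record

def max_records(record_size):
    """Upper bound on the number of records that fit on one page."""
    usable = PAGE_SIZE - HEADER_SIZE
    return usable // (record_size + SLOT_ENTRY)

# An assumed ~24-byte minimum record lands in the 310-320 range:
print(max_records(24))  # → 311
```

The key point stands either way: the bound is the same for index and data pages, because both share the same page size, header, and slot-array overhead.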
