Showing posts with label export. Show all posts

Tuesday, March 27, 2012

Data Export Wizard - Drop Table option not available

I was using the Export Data Wizard to transfer my data to a hosting provider, but the option to drop the destination table first is greyed out. Any idea? Also, why do I need to do it for each table by clicking Edit? Is there a global setting like in the old wizard?

Make sure your source and destination databases are different.

Cheers,
Dan

|||

Different server, but the same database name. I may have some restrictions on the destination server, since it is a hosting environment. However, I can create tables.

Also, how do I do this globally for all tables at once, instead of clicking Options one by one?

|||What could be the reason that the option "drop and recreate destination table" is grayed out in the Export Wizard (after clicking the table's Edit button)? Is this related to security?|||

Does anyone have a solution for this?

Is this option disabled whenever the database name is the same? Is this a bug? I have full permissions, so it can't be a security issue.


Sunday, March 25, 2012


Data Export with ADO

Hi All,

I want to develop a small tool to export data from a database (SQL Server and Oracle). I have to export full tables or part of tables depending on an SQL statement.

My first idea is to use ADO. I open a recordset on the table (adCmdTable) and I call the Save method to save the recordset to a file. But when I open the recordset, it takes a long time... I suppose that ADO performs a "select * from table" to retrieve all the rows... That's too long for me because some tables have more than 10 million rows!

Do you know if it's possible to open a table without any "select" (just open and save)? Do you know of other solutions? I can develop with ADO or ADO.Net.

Thanks in advance for your help.

François.

Using adCmdTable causes the client to wait until the table parameters are collected, including a row count.

You should be using ADO.NET (and Visual Studio.NET).

ADO.NET is a disconnected model, so there is no traffic UNTIL you execute your query, returning only the data requested. Any 'slowness' at the start up will be due to the overhead of establishing the connection, and if you judiciously use connection pooling, that may not be much of an issue.
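The deferred-execution pattern described above can be sketched in a few lines. Below is an illustrative Python sketch, using the stdlib sqlite3 module as a stand-in for the real server connection (the same idea applies to ADO.NET with a SqlDataReader): nothing is transferred until the query runs, and fetchmany() streams rows in batches, so even a multi-million-row table never has to sit in client memory all at once. The table and column names here are invented for the demo.

```python
import sqlite3

def export_rows(conn, query, params=(), batch_size=1000):
    """Stream query results in batches instead of loading the whole table.

    No data moves until the query executes, and fetchmany() pulls rows
    in modest chunks, keeping client memory use flat.
    """
    cur = conn.execute(query, params)
    while True:
        batch = cur.fetchmany(batch_size)
        if not batch:
            break
        for row in batch:
            yield row

# Demo against an in-memory table (a stand-in for the real server).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(10)])

# Only the rows we ask for are returned -- no implicit SELECT * up front.
exported = list(export_rows(conn, "SELECT id, name FROM t WHERE id < ?",
                            (5,), batch_size=2))
```

The parameterized WHERE clause is the key point: the client asks for exactly the slice it needs rather than opening the whole table adCmdTable-style.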

|||Have you looked into SSIS (SQL server integration Service) shipped as part of SQL 2005? You might not need to roll your own tool after all.|||

I tried to use the WriteXml method of a DataSet to export my data. I don't want to use a "select * from table" statement because the number of rows can be very large (more than 10 million), so I want to set the CommandType of my SqlCommand object to TableDirect, but I get an error message: "this is not supported by SqlClient .Net Framework"! Is there any solution to bind a dataset to a table (without any SQL statement) and call the WriteXml method?

Thanks.

François.

|||SSIS is nice if you use SQL Server. But, unfortunately, we have some customers with Oracle... That's why I'm trying to find a "universal" solution.|||No, 'unfortunately' using SQL Server requires the use of the SQL language. (Go figure...)
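As for the WriteXml question: TableDirect aside, the memory problem goes away if the XML is written row by row while iterating a cursor, instead of filling a DataSet first. Here is a hedged Python sketch of that streaming approach (stdlib sqlite3 again standing in for the real database; the table and column names are made up):

```python
import io
import sqlite3
from xml.sax.saxutils import escape

def write_xml(cursor, out, root="table", row_tag="row"):
    """Stream rows to XML one at a time; memory use stays flat because no
    DataSet-style in-memory copy of the table is ever built."""
    cols = [d[0] for d in cursor.description]
    out.write(f"<{root}>")
    for row in cursor:
        out.write(f"<{row_tag}>")
        for name, value in zip(cols, row):
            # Escape &, < and > so the payload stays well-formed XML.
            out.write(f"<{name}>{escape(str(value))}</{name}>")
        out.write(f"</{row_tag}>")
    out.write(f"</{root}>")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, label TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a<b"), (2, "plain")])
buf = io.StringIO()
write_xml(conn.execute("SELECT id, label FROM t ORDER BY id"), buf)
xml_text = buf.getvalue()
```

In production the StringIO buffer would be a real file handle, so the output is flushed to disk as it is produced rather than accumulated.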

Data export to individual files

Hello,
I am looking to use T-SQL to extract records from a table and have each record
stored in an individual .eml file. A query-based DTS doesn't let me
specify what extension my files will have, AND I'm unsure how to have
each record written to a separate file. Any ideas on how I can pull
this off?
Much appreciated,
Pr
A few possibilities:
1. Use OSQL to run the query and pipe the output to the desired file
2. Use xp_cmdshell to issue DOS "ECHO" commands to write the desired file
3. Use a combination of xp_cmdshell and OPENROWSET to create and then
write the desired file
Tracy McKibben
MCDBA
http://www.realsqlguy.com
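Beyond Tracy's three server-side options, a small client-side script is another way to get one file per record. A rough Python sketch (stdlib sqlite3 as a stand-in source; the `msg` table and its columns are invented for illustration) that names each file after a key column and gives it whatever extension you need, .eml included:

```python
import os
import sqlite3
import tempfile

def export_to_files(cursor, out_dir, ext=".eml"):
    """Write each record to its own file: the filename comes from the
    first column, the body from the second, with a caller-chosen extension."""
    written = []
    for key, body in cursor:
        path = os.path.join(out_dir, f"{key}{ext}")
        with open(path, "w", encoding="utf-8") as f:
            f.write(body)
        written.append(path)
    return written

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE msg (id INTEGER, body TEXT)")
conn.executemany("INSERT INTO msg VALUES (?, ?)",
                 [(1, "first message"), (2, "second message")])
out_dir = tempfile.mkdtemp()
paths = export_to_files(conn.execute("SELECT id, body FROM msg"), out_dir)
```

Because the cursor is iterated row by row, this handles large tables without holding everything in memory, and the extension is fully under your control, unlike the DTS text destination.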

data export to flat file

Hi, I hope you can help, and thank you.
SQL 2000. I need to generate a flat-file export from a single table. I need
a line feed between selected fields. I've tried using bcp and BULK INSERT with
a format file, but I've never used either, and I must be missing a critical
step somewhere.
The flat file format for a single record would be similar to the following:
Field1, Field2, Field3, Field4, Field5, Image Path1
Image Path2
Image Path3
I appreciate any assistance you can provide. Thanks!
Did you try the DTS (Data Transformation Services) tool in SQL Server 2000?
Easy to use for transforming data to export files.
http://www.microsoft.com/technet/prodtechnol/sql/2000/deploy/dtssql2k.mspx
Good luck.
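For what it's worth, the multi-line record layout in the question is also easy to produce with a small script outside DTS. Below is a hedged Python sketch (stdlib sqlite3 as a stand-in source; the column names are invented): the first line of each record carries Field1 through Field5 plus the first image path, comma-separated, and each remaining image path gets a line of its own.

```python
import io
import sqlite3

def write_flat(cursor, out):
    """First six values go on one comma-separated line; the remaining
    image-path values each get a line of their own."""
    for row in cursor:
        head, rest = row[:6], row[6:]
        out.write(", ".join(str(v) for v in head))
        for p in rest:
            out.write("\n" + str(p))
        out.write("\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE r (f1, f2, f3, f4, f5, p1, p2, p3)")
conn.execute("INSERT INTO r VALUES "
             "('a','b','c','d','e','img1.jpg','img2.jpg','img3.jpg')")
buf = io.StringIO()
write_flat(conn.execute("SELECT * FROM r"), buf)
flat = buf.getvalue()
```

A bcp format file can express the same layout, but slicing the row in client code is often easier to get right the first time.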

data export to CSV

I have a report that produces over 3 million rows; this report is
exported to CSV and is scheduled to run weekly.
Sometimes the report needs to be run manually. Initially this report
timed out; now the report runs but takes a very long time to complete.
If the stored procedure is run directly it takes about 5 mins to
execute. Running in reporting services takes upward of an hour.
Does anyone have suggestions on how to speed up this report.
My initial thought would be to use SSRS to populate the parameters
and maybe hook into the reporting engine and execute the stored
procedure directly with the parameters defined
thanks in advance
Ian
|||Hi, how do you run this report? HTTP? Web service? How?
--
----
Microsoft MVP in SQL Server
SQLTotal Consulting - SQL Server Services
Email: maxi.da@.gmail.com.nospam
----
|||Do not waste your time trying to run the report different ways; it will not affect the time.
Your time is coming in two places. One, does it really only take 5 minutes
to extract 5 million records? I'm not sure how it works when executing a
stored procedure from Query Analyzer that returns 5 million records, whether
it really retrieves all the records or not. Anyway, I think that is one
area, the number of records being physically returned takes time.
Second, RS 2000 and RS 2005 do all rendering in RAM. They do not page
anything out to disk or utilize the disk at all when rendering. I am pretty
sure this is changing substantially in RS 2008. Adding more RAM to the
server will help.
The other option is to use SSIS instead.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
|||An additional point: BCP is the tool to use (not SSIS and not RS) if
performance is your goal. Give a DBA the query used by the report and they
could have BCP written and scripted for you in no time. It's been over a
decade since I used BCP and I had forgotten about it.
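To make the contrast with RS's all-in-RAM rendering concrete, the streaming alternative can be sketched in a few lines of client code. This is an illustrative Python sketch only (stdlib sqlite3 and csv as stand-ins for the real server and renderer; BCP itself is a separate command-line tool): rows are fetched in batches and appended to the CSV as they arrive, so the full multi-million-row result never has to be rendered in memory at once.

```python
import csv
import io
import sqlite3

def export_csv(conn, query, out, batch_size=50_000):
    """Fetch rows in batches and append them to the CSV as they arrive,
    keeping memory use flat regardless of result size."""
    cur = conn.execute(query)
    writer = csv.writer(out)
    writer.writerow(d[0] for d in cur.description)  # header row
    total = 0
    while True:
        batch = cur.fetchmany(batch_size)
        if not batch:
            break
        writer.writerows(batch)
        total += len(batch)
    return total

# Small demo table standing in for the 3-million-row report query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big (id INTEGER, val TEXT)")
conn.executemany("INSERT INTO big VALUES (?, ?)",
                 [(i, f"v{i}") for i in range(100)])
buf = io.StringIO()
n = export_csv(conn, "SELECT id, val FROM big", buf, batch_size=32)
```

In production the StringIO would be a file opened with `newline=""`, and the batch size would be tuned to the network round-trip cost.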
Bruce Loehle-Conger
MVP SQL Server Reporting Services

Data export from SQL in CSV format

Hi
I want to export data out of a SQL database using a query into a comma-delimited (CSV) file. Someone suggested that I can use BCP.exe; I am not sure how to use it properly.
Can someone please suggest how to do it? I am not a SQL DBA and am struggling with this.
|||It's much easier to do this using the DTS Data Export under the Tools menu
in the Enterprise Manager. Pick Text File as your destination. The default
format is CSV with embedded double quotes correctly handled.
If you use BCP, you'll have to deal with the embedded double quotes in each
column where that's a possibility, using a T-SQL function such as REPLACE().
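The embedded-quote handling Linchi mentions is exactly what a CSV library does for you. A short Python illustration (stdlib csv module only): fields containing commas or double quotes are wrapped in quotes and the inner quotes doubled, the same transformation you would otherwise hand-roll with REPLACE() before a bcp export.

```python
import csv
import io

# Rows whose values contain a comma and embedded double quotes.
rows = [("1", 'he said "hi"'), ("2", "a,b")]

buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_MINIMAL)
writer.writerows(rows)
csv_text = buf.getvalue()
# QUOTE_MINIMAL quotes only the fields that need it: the embedded quotes
# are doubled and the field wrapped, and the comma-bearing field is wrapped.
```

Pointing a script like this at real query results gives the same correctly-escaped output the DTS text destination produces by default.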
--
Linchi Shea
linchi_shea@.NOSPAMml.com
|||You can also use the command-line osql.exe, and there is documentation in Books
Online on the syntax.
Rand
This posting is provided "as is" with no warranties and confers no rights.
|||Rand,
Could you please elaborate? Are you talking about the -s command line option
of osql.exe?
--
Linchi Shea
linchi_shea@.NOSPAMml.com
|||The -s parameter does allow you to define a column delimiter. The default,
I believe, is a tab.
Rand
This posting is provided "as is" with no warranties and confers no rights.

Data Encryption

Hi,

We need to set up a data export process from a SQL DB.

The output (be it XML, Text Files or whatever) needs to be encrypted before it is FTPd somewhere.

Is there support for encryption in SSIS? How / where in the package designer would you achieve this?

Thanks in advance.

Martin

There is no built-in support for encryption of data. It would be an interesting custom task, though, if you fancy having a go.

Otherwise, request this for a future enhancement at http://connect.microsoft.com

-Jamie

|||Thanks, that's a shame; I was hoping to use it as a lever to kick off the upgrade process from SQL 2000.|||

Actually, come to think of it, you could leverage .Net's encryption routines quite easily using the script component.

Try doing that!

-Jamie

|||I've been working on an encryption transform actually. We're waiting to get everything in place from corp to release it. If you would like, send an e-mail to jason.gerard at idea.com and I'll let you know when it's ready.

In the meantime, you could use the .NET encryption APIs from inside a Script Transform.

Wednesday, March 7, 2012

Data being cut off on text import

I am trying to use the SQL Server 2000 import/export function to import a
tab-delimited text file. Some of the lines have upwards of 27,000
characters. The first column of data is only 16 characters; the
rest is in the second column. The import works, however the
largest data set ends up with only 8194 characters of data in the
field. I am importing the data into an NTEXT field, but no joy. Any
suggestions?
Thanks.
JR
How are you determining that there are 8194 characters? Are you selecting
the data in Query Analyzer, copying it, and measuring the length? In this
case, the output is limited to 8192 characters, and does not necessarily
truly represent the data in the column. Try applying DATALENGTH() to the
column directly.
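The DATALENGTH() suggestion can be tried directly. Here is an illustrative Python sketch of the same check (stdlib sqlite3 standing in for SQL Server, with sqlite's length() standing in for T-SQL's DATALENGTH()): ask the database itself for the stored length rather than measuring output copied from a query tool, which may be truncated at display time.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE doc (id INTEGER, body TEXT)")
long_text = "x" * 27000  # longer than the 8192-char display limit in QA
conn.execute("INSERT INTO doc VALUES (1, ?)", (long_text,))

# Measure the column in the database, not in the client's pasted output.
(stored_len,) = conn.execute(
    "SELECT length(body) FROM doc WHERE id = 1").fetchone()
```

If the server-side length comes back as the full 27,000, the data was imported intact and only the display was truncated; if it comes back at ~8K, the import itself is clipping the column.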

Friday, February 17, 2012

CSV Export in ASCII format

For the CSV export problem, where Excel opens the data into one column, I
had to add a report link using the code below to fix the problem. Is
there a way to change the underlying export encoding in Reporting
Services?
="javascript:void(window.open(top.frames[0].frames[1].location.href.replace('Format=HTML4.0','Format=CSV&rc%3aEncoding=ASCII'),'_blank'))"
There is for RS 2005. Not for RS 2000.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
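On the encoding question more generally: one common trick, separate from the rc:Encoding device parameter used in the link above, is to emit the CSV with a byte-order mark so Excel detects the encoding and splits the fields into columns. A hedged Python sketch (stdlib only; this illustrates the BOM idea, not the Reporting Services mechanism itself):

```python
import csv
import io

def csv_bytes(rows, encoding="utf-8-sig"):
    """Render rows as CSV bytes; 'utf-8-sig' prepends a byte-order mark,
    one common way to get Excel to recognise the encoding and split
    the fields into columns rather than dumping them into one."""
    buf = io.StringIO()
    csv.writer(buf).writerows(rows)
    return buf.getvalue().encode(encoding)

# Demo with a non-ASCII value to show the encoding actually matters.
data = csv_bytes([("id", "name"), (1, "café")])
```

Whether the BOM route or forcing ASCII (as in the report link) is the right fix depends on whether the data contains non-ASCII characters: forcing ASCII will mangle them, while the BOM preserves them.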