Tuesday, March 27, 2012
data extraction/mapping between two databases
Hi,
Just wondering, what's the easiest way to map about 10,000 rows from database A (old) to database B?
The complexity is that the structures are completely different: database B is more normalized and has its relationships maintained, whereas database A has everything in one table. So I have to map one column (choosing distinct rows) and move it to a new table in database B.
I did try DTS for a couple of them, but it seems tedious and never-ending.
What's the easiest way to do this? Any O/R tools? Anything...
Thanks,
I'd just understand the data first, and then populate the tables manually using INSERT statements, filling the primary key tables before the foreign key tables.
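A minimal sketch of that approach (all table and column names here are hypothetical, not from the original post): insert the distinct values into the parent table first, then join back to the wide source table to populate the child rows with the new foreign keys.

```sql
-- Hypothetical names: OldDb.dbo.BigTable is the one-table source,
-- Customer and [Order] are the normalized targets in the new database.
-- Populate the primary key (parent) table with distinct values first.
INSERT INTO NewDb.dbo.Customer (CustomerName)
SELECT DISTINCT o.CustomerName
FROM OldDb.dbo.BigTable AS o;

-- Then populate the foreign key (child) table by joining back
-- to pick up the newly generated keys.
INSERT INTO NewDb.dbo.[Order] (CustomerID, OrderDate, Amount)
SELECT c.CustomerID, o.OrderDate, o.Amount
FROM OldDb.dbo.BigTable AS o
JOIN NewDb.dbo.Customer AS c
  ON c.CustomerName = o.CustomerName;
```

For ~10,000 rows this runs as one set-based pass per table, which is usually far quicker than hand-crafting row-by-row INSERTs or a DTS transform per column.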
HTH,
Vyas, MVP (SQL Server)
http://vyaskn.tripod.com/
Wednesday, March 7, 2012
Data Archiving
Production data is accumulating, and I would like to archive the old data.
I want the archive database to hold the whole, complete data set, for READ ONLY use (historical reporting, audit checking, etc.).
Then the production database becomes lightweight and can perform updates efficiently.
Without replication, what do you suggest? Thanks for any help.
I suggest a DTS task that runs overnight and copies each day's new rows to the backup database, so you can query and update the production database efficiently. I also suggest a separate server to host the backup database.
Hope this helps.
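A sketch of what that nightly copy could look like (the [ArchiveServer] linked server, the Orders table and its CreatedAt column are hypothetical placeholders, not from the thread):

```sql
-- Copy rows added in roughly the last day to the backup database,
-- skipping any that are already there. All names are placeholders.
INSERT INTO [ArchiveServer].Archive.dbo.Orders
    (OrderID, CustomerID, Amount, CreatedAt)
SELECT o.OrderID, o.CustomerID, o.Amount, o.CreatedAt
FROM dbo.Orders AS o
WHERE o.CreatedAt >= DATEADD(day, -1, GETDATE())
  AND NOT EXISTS (SELECT 1
                  FROM [ArchiveServer].Archive.dbo.Orders AS a
                  WHERE a.OrderID = o.OrderID);
```

Scheduled overnight via DTS or SQL Server Agent; deleting the archived rows from production would then be a separate, deliberate step.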
Thanks for the suggestion, but could DTS determine which existing rows have been modified?
No, it could not; you would need to flag the updated rows, perhaps with a datetime column. Another solution that comes to mind is an update/insert trigger that inserts those rows into the backup database... in that case you won't need DTS.
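A rough sketch of that trigger idea, under stated assumptions (a hypothetical dbo.Orders table, an Archive database reachable from the production server, and matching column lists; none of these names come from the thread):

```sql
-- Copies inserted or updated rows into the archive copy of the table.
-- dbo.Orders, Archive.dbo.Orders and the columns are hypothetical.
CREATE TRIGGER trg_Orders_Archive
ON dbo.Orders
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Remove any stale archive copies of the affected rows first,
    -- so an UPDATE does not leave duplicates behind.
    DELETE a
    FROM Archive.dbo.Orders AS a
    JOIN inserted AS i ON i.OrderID = a.OrderID;

    INSERT INTO Archive.dbo.Orders (OrderID, CustomerID, Amount, ModifiedAt)
    SELECT i.OrderID, i.CustomerID, i.Amount, GETDATE()
    FROM inserted AS i;
END;
```

Note that the trigger makes every write pay the cost of the archive insert, so the nightly DTS job may still be preferable when write volume is high.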
Data Archiving
If I have a huge database, I use data archiving to move old data between tables. E.g. I have table A, a huge table, and I archive some of the old data to table B. Say my data is transaction data, and I want to retrieve it to generate a historical report of my yearly transactions. Say I've split the data through March 2003 out of table A into table B, which means only the April to July transactions remain in table A. Now I want to retrieve the data from January until now to generate a half-year report. How should I do it? Can I just use a SQL query to select data from table A where the date runs from January to July, and the system will automatically look back for the data that has been archived into table B and retrieve it for me? Or should I use two separate queries to select it from table A (Apr-Jul) and table B (Jan-Mar)?

esupport,
you could create a view like the following:
create View MyData as
select * from MyOldData
union all
select * from MyActualData
go
and query from this view instead of querying the tables directly.

If I use the view instead of querying the tables, will this affect my performance or my database size? Will it slow down my system? Normally my database is quite big, 2-3 GB. Is this the optimum solution to the problem?

Your other alternative is to maintain your current strategy (if it's working) and possibly create an aggregate table. This will only work in some circumstances; it depends on the needs of your business.
For instance, suppose tables A and B contain individual transactions (let's say 100,000 a day). If your reports only ever report at a daily level, then once a day summarise the data:

select date, sum(sales)
from TableA
group by date

Load that summarised data into your aggregate table and you should be good to go.
This would cut down the size of the table you report from by n rows a day while still keeping all your transactions in the other table. This is sort of moving into data warehousing concepts and it may not work for you, but I thought I'd suggest it. Good luck.

Thanks for the suggestions, they help a lot. BTW, is there any tool or built-in command in SQL Server 2000 to archive data from my production database into a history archive database?

Depending on how complex you want to get, it might be possible to get away with just a stored procedure or two. Otherwise the DTS tool might come in handy, and if you want to get really serious, Analysis Server, if you have the licence and it is appropriate, etc.
Have fun.

If I'm using the UNION ALL suggested by kukuk:

create View MyData as
select * from MyCurrentData
union all
select * from MyHistoryData
go

Say I've archived my database up to March into the table MyHistoryData, and now I want to retrieve data from June to July for reporting. How would that perform? Will the query "search" through all the data in both MyCurrentData and MyHistoryData? And how does a single query, "SELECT * FROM MyCurrentData", compare with the UNION ALL view for retrieving data from the two tables?

Hi,
I think BOL will answer your questions. Look there for "Using Partitioned Views".

Another question from me:
What is the size of the database table after archiving? Will it become smaller or stay the same? Say my database before archiving is 2 GB; after archiving, will it be smaller than 2 GB?

Any ideas on back-end processing? Let's say I'm retrieving historical data for reporting, and in the meantime I would like to continue with my front-end processing, such as daily transactions. In my current system, while a data retrieval is running, I cannot do anything until it has finished. One issue, I think, is the size of my database; another is that the technique I'm using to retrieve the data is bad. I hope to hear more advice.
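As a footnote to the "Using Partitioned Views" pointer above, the pattern looks roughly like this (hypothetical tables; the CHECK constraints on the date column are what allow the optimizer to skip the member table that cannot contain the requested range):

```sql
-- Each member table carries a CHECK constraint on the partitioning column.
-- Table and column names are hypothetical.
CREATE TABLE MyHistoryData (
    TranID   int      NOT NULL,
    TranDate datetime NOT NULL CHECK (TranDate < '20030401'),
    Amount   money    NOT NULL,
    PRIMARY KEY (TranID, TranDate)
);

CREATE TABLE MyCurrentData (
    TranID   int      NOT NULL,
    TranDate datetime NOT NULL CHECK (TranDate >= '20030401'),
    Amount   money    NOT NULL,
    PRIMARY KEY (TranID, TranDate)
);
GO

CREATE VIEW MyData AS
SELECT TranID, TranDate, Amount FROM MyHistoryData
UNION ALL
SELECT TranID, TranDate, Amount FROM MyCurrentData;
GO

-- A June-to-July query should only touch MyCurrentData,
-- because the CHECK constraint rules out MyHistoryData.
SELECT * FROM MyData
WHERE TranDate >= '20030601' AND TranDate < '20030801';
```

With those constraints in place, the UNION ALL view need not scan the history table for date ranges it provably cannot hold.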
Sunday, February 19, 2012
daily transfer from oracle schema to similar SQL-Server
I have some small (<5000 records) Oracle tables that I need to transfer daily to SQL-Server.
The old legacy standalone Oracle system here in Hong Kong must still be used to enter property data, so I cannot change it. The idea is to get this data into a SQL-Server every day. I am sure they can be connected over the company's LAN, if they aren't already. The aim is then to put together a basic front-end for this data. Although the two systems are fairly similar (property web site data), their schemas are fixed, and the SQL-Server schema is in use by another branch of the company for a U.K. property site. The Hong Kong team want me to get their legacy data into this new system every day and then build a similar web site for their HK data.
Is there a way I can, say, automate DTS to run every day on the SQL-Server machine, pull the Oracle tables over the LAN into SQL-Server, and then write scheduled stored procedures that massage the data from one schema to the other? (I will have to lose a bit of data, since the tables and fields are not exactly the same, though with property data it is all pretty much the same.)
I once did this using ASP (after getting the data into a CSV file from an Access database); the script looped through each line of data and put it into the relevant table.
Anyone got any ideas? Much appreciated.

Try using a linked server to transfer the data from the Oracle server; it is much faster. Write a stored procedure for it and then schedule it as a job.

Thanks. I will meet their IT admin guy next week. So, I will have an Oracle DB and a SQL-Server DB. I am familiar with Enterprise Manager, and I can add remote SQL-Server groups. How would I go about setting up a linked server? In my main job we use Lotus Enterprise Integrator (LEI), a handy tool that transfers data from Lotus Notes to anything else, but I am not sure of a tool for my needs. I'm more from a programming background; can you explain the linked server setup? The scheduled SPs I've done before.

Many thanks.
Many Thanks|||You can look in books online and search for "linked server". Also, in the enterprise manager under the security directory are the linked servers visible.|||Originally posted by jora
You can look in books online and search for "linked server". Also, in the enterprise manager under the security directory are the linked servers visible.
Thanks guys, I've just trawled though a few of these pages. Seems DTS is a safe option. I've just started on Enterprise manager, trying to create a package in the meantime to transfer data every day from some test SQL table from one DB to another DB on our sql-server. Looks like I should go over to the client next week, and setup a DTS package on their SQL-Server, try and connect to their Oracle tables and go from there.
I'll check ot the LINK server, I'll need to connect to their legacy Server
Cheers
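For reference, a linked-server setup of the kind suggested above can be sketched like this (the server name, TNS alias, login and table names are all hypothetical):

```sql
-- Register the Oracle system as a linked server (names are placeholders).
EXEC sp_addlinkedserver
    @server     = 'ORA_LEGACY',
    @srvproduct = 'Oracle',
    @provider   = 'MSDAORA',
    @datasrc    = 'HK_PROPERTY';   -- the Oracle TNS alias

-- Map a SQL Server login to an Oracle account.
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = 'ORA_LEGACY',
    @useself     = 'false',
    @rmtuser     = 'legacy_user',
    @rmtpassword = 'secret';

-- A scheduled stored procedure can then pull the rows across, e.g.:
INSERT INTO dbo.Property (PropertyID, Address, Price)
SELECT property_id, address, price
FROM OPENQUERY(ORA_LEGACY,
    'SELECT property_id, address, price FROM properties');
```

The schema massaging then lives in ordinary T-SQL, which is easy to schedule as a job, as suggested above.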