Thursday, March 29, 2012
Data File Remaining Space
I am running SQL 2000. I have a data file set up with restricted growth. Is
there a stored procedure or script available that I can schedule or run
manually when I want that will give me the space remaining in the data file?
--
Message posted via http://www.sqlmonster.com
Check out sp_spaceused in BOL.
--
Paul Randal
Dev Lead, Microsoft SQL Server Storage Engine
This posting is provided "AS IS" with no warranties, and confers no rights.
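A minimal sketch of the sp_spaceused call mentioned above (the database name is a placeholder):

USE YourDatabase
GO
-- database-level summary: total size, unallocated space, and space used by data and indexes
EXEC sp_spaceused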
"Robert Richards via SQLMonster.com" <forum@.SQLMonster.com> wrote in message
news:be0b5b63225d4993937c2075fbb4510f@.SQLMonster.com...
> I am running SQL 2000. I have a data file set up with restricted growth.
Is
> there a stored procedure or script available that I can schedule or run
> when I want manually that will give me the space remaining in the data
file?
> --
sp_helpdb YourDBNameHere
gives some information regarding database and file size.
--
Keith
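A minimal sketch of a script for the original question, using the SQL 2000 sysfiles catalog and FILEPROPERTY (YourDBNameHere is a placeholder); it reports the free space inside each data file:

USE YourDBNameHere
GO
SELECT  name                                             AS logical_file,
        size / 128.0                                     AS file_size_mb,
        FILEPROPERTY(name, 'SpaceUsed') / 128.0          AS space_used_mb,
        (size - FILEPROPERTY(name, 'SpaceUsed')) / 128.0 AS free_space_mb
FROM    sysfiles
WHERE   (status & 64) = 0   -- data files only; log files have the 0x40 status bit set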
"Robert Richards via SQLMonster.com" <forum@.SQLMonster.com> wrote in message
news:be0b5b63225d4993937c2075fbb4510f@.SQLMonster.com...
> I am running SQL 2000. I have a data file set up with restricted growth.
Is
> there a stored procedure or script available that I can schedule or run
> when I want manually that will give me the space remaining in the data
file?
> --
> Message posted via http://www.sqlmonster.comsql
Tuesday, March 27, 2012
Importing a data file via a stored procedure
I want to import a data file into a SQL database with a stored procedure, but I have no idea how to do that.
The procedure must also run automatically every morning at 10 o'clock.
Greetz, Tom
Create a stored procedure that imports the data and set up a job to run that procedure on that schedule.
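A minimal sketch of that approach, assuming a comma-delimited file and a staging table (the file path, table and procedure names are invented for illustration):

CREATE PROCEDURE dbo.usp_ImportDataFile
AS
BEGIN
    -- load the flat file into a staging table; adjust the terminators to match the file
    BULK INSERT dbo.ImportStaging
    FROM 'C:\data\import.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)
END
GO

The daily 10:00 run can then be set up as a SQL Server Agent job with a single step that executes EXEC dbo.usp_ImportDataFile on a daily schedule.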
Data File Growth During Re-Index
I have a 100GB SQL Server database and when I run a Database Re-Index the
data file (mdf) increases in size from 100GB to 190GB.
Is this normal, and if not, how do I go about resolving the issue? Once the
re-index is finished there is ~90GB of free space in the database.
Thanks
- David T
David,
1) SQL Server 2005
ALTER INDEX ... REBUILD: there is an ONLINE option too; see BOL for details.
2) SQL Server 2000
Don't run the command on all tables. Identify the heavily fragmented tables and
then run DBCC on those only (see the sketch below).
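A minimal SQL 2000 sketch of that approach (the table name is a placeholder; the threshold is up to you):

-- list fragmentation for all tables/indexes in the current database
DBCC SHOWCONTIG WITH TABLERESULTS, ALL_INDEXES, FAST
-- inspect LogicalFragmentation / ScanDensity in the output, then reindex
-- only the heavily fragmented tables, for example:
DBCC DBREINDEX ('dbo.YourLargeTable')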
"David" <David@.discussions.microsoft.com> wrote in message
news:E00AC669-B2C3-4324-B082-4844771F9B93@.microsoft.com...
> Hi All
> I have a 100GB SQL Server database and when I run a Database Re-Index the
> data file (mdf) increases in size from 100GB to 190GB.
> Is this normal and if not how do I go about resolving the issue as once
> the
> re-index is finished there is ~90GB of free space in the database.
> Thanks
> - David T
Also, if 2005, check the SORT_IN_TEMPDB option of ALTER INDEX. Sort operations
are performed in tempdb, which does cause tempdb to grow so consider this
also.
--
Adam J Warne, MCDBA
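A minimal sketch of that option (SQL 2005 syntax; the table name is a placeholder, and ONLINE rebuilds require Enterprise Edition):

ALTER INDEX ALL ON dbo.YourLargeTable
REBUILD WITH (SORT_IN_TEMPDB = ON, ONLINE = ON);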
"David" wrote:
> Hi All
> I have a 100GB SQL Server database and when I run a Database Re-Index the
> data file (mdf) increases in size from 100GB to 190GB.
> Is this normal and if not how do I go about resolving the issue as once the
> re-index is finished there is ~90GB of free space in the database.
> Thanks
> - David T
Having lots of free space is good for a db so that when you perform
operations such as these there is enough room to do its job properly. When
you reindex, it creates a new copy of all the indexes before dropping and
renaming the existing ones, so you need room for this to happen. Twice the
space is usually not required, although it won't hurt anything. I suspect
your autogrow is set to a high percentage. Most dbs don't need all indexes
rebuilt each time, so you may want to consider only rebuilding ones that are
fairly fragmented. See DBCC SHOWCONTIG in Books Online for a sample script to
defrag based on %.
--
Andrew J. Kelly SQL MVP
Solid Quality Mentors
"David" <David@.discussions.microsoft.com> wrote in message
news:E00AC669-B2C3-4324-B082-4844771F9B93@.microsoft.com...
> Hi All
> I have a 100GB SQL Server database and when I run a Database Re-Index the
> data file (mdf) increases in size from 100GB to 190GB.
> Is this normal and if not how do I go about resolving the issue as once
> the
> re-index is finished there is ~90GB of free space in the database.
> Thanks
> - David T
It is important to understand what these commands do, in order to understand the ramifications of
running the commands. You don't mention what version of SQL Server you have, so I'll assume 2005 in
the commands I mention below. The same principle holds for 2000, but the command names are
different.
ALTER INDEX ... REBUILD:
This will create a new index internally, and *after that has been done* the old one is removed. So
if you have one large table in the database, consuming say close to 90 GB, and that table has a
clustered index, then what you see is expected.
As suggested:
Don't reindex if you don't need to. Base your reindexing on fragmentation level and on whether you
actually see enough performance improvement from the reindex.
Also, REORGANIZE instead of REBUILD does something different and does not require as much working
space in the database.
Check out http://www.microsoft.com/technet/prodtechnol/sql/2000/maintain/ss2kidbp.mspx for more
information (as well as Books Online of course).
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://sqlblog.com/blogs/tibor_karaszi
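A minimal sketch of the two options Tibor contrasts (SQL 2005 syntax; the table and index names are placeholders):

-- full rebuild: builds a new copy first, so it needs free space roughly the size of the index
ALTER INDEX IX_YourIndex ON dbo.YourLargeTable REBUILD;
-- in-place defragmentation: needs far less working space
ALTER INDEX IX_YourIndex ON dbo.YourLargeTable REORGANIZE;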
"David" <David@.discussions.microsoft.com> wrote in message
news:E00AC669-B2C3-4324-B082-4844771F9B93@.microsoft.com...
> Hi All
> I have a 100GB SQL Server database and when I run a Database Re-Index the
> data file (mdf) increases in size from 100GB to 190GB.
> Is this normal and if not how do I go about resolving the issue as once the
> re-index is finished there is ~90GB of free space in the database.
> Thanks
> - David T
> - David T
Data Field not Truncating
I cannot reposition this field so it won't interrupt anything else.
Thanks for any help.
If the field has too many characters without a space, this may happen. To avoid it, create a formula @MyField with the code
mid(Field,1,20)
so only 20 characters will be printed.
Sunday, March 25, 2012
data export to CSV
I have a report that produces over 3 million rows; this report is
exported to CSV and is scheduled to run weekly.
Sometimes the report needs to be run manually. Initially this report
timed out; now the report runs but takes a very long time to complete.
If the stored procedure is run directly it takes about 5 mins to
execute. Running it in Reporting Services takes upward of an hour.
Does anyone have suggestions on how to speed up this report?
My initial thought would be to use SSRS to populate the parameters
and maybe hook into the reporting engine and execute the stored
procedure directly with the parameters defined.
thanks in advance
Ian
Hi, how do you run this report? HTTP? Web service? How?
--
----
Microsoft MVP in SQL Server
SQLTotal Consulting - SQL Server services
Email: maxi.da@gmail.com.nospam
----
Do not waste your time trying to run the report different ways; it will not
affect the time.
Your time is coming from two places. One, does it really only take 5 minutes
to extract 5 million records? I'm not sure how it works when executing a
stored procedure from Query Analyzer that returns 5 million records, whether
it really retrieves all the records or not. Anyway, I think that is one
area: the number of records being physically returned takes time.
Second, RS 2000 and RS 2005 do all rendering in RAM. They do not page
anything out to disk or utilize the disk at all when rendering. I am pretty
sure this is changing substantially in RS 2008. Adding more RAM to the
server will help.
The other option is to use SSIS instead.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
An additional point: BCP is the tool to use (not SSIS and not RS) if
performance is your goal. Give a DBA your query used by the report and they
could have BCP written and scripted for you in no time. It's been over a
decade since I used BCP and I had forgotten about it.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
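A rough sketch of the BCP route (none of these names come from the thread: the server, database, procedure and output path are placeholders, and xp_cmdshell is assumed to be available):

EXEC master..xp_cmdshell
    'bcp "EXEC YourDB.dbo.usp_BigReport" queryout "C:\exports\report.csv" -c -t, -S YourServer -T'
-- -c: character mode, -t, : comma field terminator, -T: trusted connection;
-- the same bcp command can also be run from a command prompt or a scheduled task.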
"Bruce L-C [MVP]" <bruce_lcNOSPAM@.hotmail.com> wrote in message
news:eES$LTZPIHA.4272@.TK2MSFTNGP06.phx.gbl...
> Do not waste your time trying to run the report different ways, it will
> not affect the time.
> Your time is coming in two places. One, does it really only take 5 minutes
> to extract 5 million records? I'm not sure how it works when executing a
> stored procedure from Query Analyzer that returns 5 million records,
> whether it really retrieves all the records or not. Anyway, I think that
> is one area, the number of records being physically returned takes time.
> Second, RS 2000 and RS 2005 does all rendering in RAM. It does not page
> anything out to disk or utilize the disk at all when rendering. I am
> pretty sure this is changing substantially in RS 2008. Adding more RAM to
> the server will help.
> The other option is to use SSIS instead.
>
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
> <emery.ian@.googlemail.com> wrote in message
> news:e977052b-81b7-404f-88e1-d14968fbe0ab@.e25g2000prg.googlegroups.com...
>>I have a report that produces over 3 million rows, this report is
>> export to CSV and is scheduled to run weekly.
>> Sometimes the report is needed to run manually. Initially this report
>> timed out, now the report runs but takes a very long time to complete.
>> If the stored procedure is run directly it takes about 5 mins to
>> execute. Running in reporting services takes upward of an hour.
>> Does anyone have suggestions on how to speed up this report.
>> my initial thoughts would be to use SSRS to populate the parameters
>> and maybe hook into the reporting engine and execute the stored
>> procedure directly with the parameters defined
>>
>> thanks in advance
>> Ian
>
Thursday, March 22, 2012
data driven subscription with special character in parameter
I have a report which takes in a name as the parameter. When I run the
report from the server manually using the name Monzón it works fine. But when
that same report runs as part of a data driven subscription, the report comes
out blank. The data source is a sql table.
Any ideas?
Thanks,
Marcus
You might want to check the collation on the database that the table
resides in. I don't believe that the default collation accommodates
español/etc. This link should help.
http://msdn2.microsoft.com/en-us/library/ms190920.aspx
Regards,
Enrique Martinez
Sr. Software Consultant
You might want to change the collation of the column in the table that
contains Monzón.
This link should help.
http://msdn2.microsoft.com/en-us/library/ms190920.aspx
Regards,
Enrique Martinez
Sr. Software Consultant
The collation is SQL_Latin1_General_CP1P_CI_AS; it accepts the ó and returns
it in a query.
BTW. This is not the same server that is running the SRS subscription.
This seems directly related to data driven subscriptions.
Marcus
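Since the subscription runs on a different server, a quick way to compare the collations on both sides is a pair of queries like these (a minimal sketch; the table and column names are placeholders):

-- default collation of the current database
SELECT DATABASEPROPERTYEX(DB_NAME(), 'Collation') AS database_collation

-- collation of the column holding the name
SELECT COLUMN_NAME, COLLATION_NAME
FROM   INFORMATION_SCHEMA.COLUMNS
WHERE  TABLE_NAME = 'YourTable'
  AND  COLUMN_NAME = 'YourNameColumn'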
"EMartinez" wrote:
> On Oct 19, 4:05 pm, Marcus K <Marc...@.discussions.microsoft.com>
> wrote:
> > Hi all,
> >
> > I have a report which takes in a name as the parameter. When I run the
> > report from the server manually using the name Monzón it works fine. But when
> > that same report runs as part of a data driven subscription, the report comes
> > out blank. The data source is a sql table.
> >
> > Any ideas?
> >
> > Thanks,
> > Marcus
>
> You might want to check the collation on the database that the table
> resides in. I don't believe that the default collation accommodates
> español/etc. This link should help.
> http://msdn2.microsoft.com/en-us/library/ms190920.aspx
> Regards,
> Enrique Martinez
> Sr. Software Consultant
>
Data Driven Subscription Help
Hello all,
I am trying to schedule an MDX report to run once a week, and it only works if I pass one parameter per field. It fails every time I try to pass multiple parameters. Do you know if there is a way to pass multiple MDX parameters?
The parameter field is set as a multi-value field.
This worked –
'[Dim Travel Product].[Dim Travel Product].&[67]'
This failed –
'[Dim Travel Product].[Dim Travel Product].&[67], [Dim Travel Product].[Dim Travel Product].&[70], [Dim Travel Product].[Dim Travel Product].&[69]'
This failed -
'[Dim Travel Product].[Dim Travel Product].&[67]&[69]&[70]'
Could you post the MDX statement into which the parameters are being passed?
Thanks,
Bryan
The parameter is passed to @TravelProduct:
SET [Dim Travel Product Set] AS StrToSet(@TravelProduct)
MEMBER [Dim Travel Product].[Dim Travel Product].[Travel Product Subset] AS 'Aggregate([Dim Travel Product Set])'
Thanks in advance!
It also works if I run this manually and select multiple parameters. The problem is I don't know the proper syntax of what's being passed when there are multiple selections.
Enclose the list of members inside { } when passing multiple members.
None of the syntax below works:
'{[Dim Travel Product].[Dim Travel Product].&[4]&[5]&[7]}'
'{[Dim Travel Product].[Dim Travel Product].&[4], [Dim Travel Product].[Dim Travel Product].&[5], [Dim Travel Product].[Dim Travel Product].&[7]}'
This is the method that I am using to pass the parameter:
So this is still working:
SELECT CONVERT(char(10),DateAdd(dd, -1 - (DatePart(dw, getdate()) - 2), getdate()),101) as Sunday,
'[Dim Travel Product].[Dim Travel Product].&[4]' as TravProd,
'[Dim Car Vendor].[Car Vendor Desc].[All]' as Vendor
This does not work:
SELECT CONVERT(char(10),DateAdd(dd, -1 - (DatePart(dw, getdate()) - 2), getdate()),101) as Sunday,
'{[Dim Travel Product].[Dim Travel Product].&[4], [Dim Travel Product].[Dim Travel Product].&[5], [Dim Travel Product].[Dim Travel Product].&[7]}' as TravProd,
'[Dim Car Vendor].[Car Vendor Desc].[All]' as Vendor
This does not work:
SELECT CONVERT(char(10),DateAdd(dd, -1 - (DatePart(dw, getdate()) - 2), getdate()),101) as Sunday,
'{[Dim Travel Product].[Dim Travel Product].&[4]&[5]&[7]}' as TravProd,
'[Dim Car Vendor].[Car Vendor Desc].[All]' as Vendor
Take a look at this. It works against Adventure Works:
select
[Date].[Date].[July 1, 2003] on 0,
strtoset("{[Product].[Category].[Bikes],[Product].[Category].[Clothing]}") on 1
from [Adventure Works]
B.
What I am trying to do is pass @TravelProduct from a data driven subscription in Reporting Services and schedule it to run once a week. The SELECT statement that I built contains all my filters, but when I add more than one filter for a particular field, it errors.
Thanks!
What my code illustrates is how the MDX must be constructed to support multiple values from a parameter. You need to use STRTOSET, you need to wrap the comma-delimited list of set members in curly braces, and that set string needs to be wrapped in double quotes. If you've got all that in place, you can use a parameter in SSRS to supply the set of members. The trick is getting your MDX in order so it can work with the value from the parameter.
B.
Thanks! I'll try that later today and report back.
I tried what Bryan did above but it is still not working. ;(
Data Driven Subscription - Will the Report run for each parameter?
Hi,
I have a report which has 200 parameters to be passed, for which I am using a data driven subscription. This report takes 10 minutes to run, so does this mean that this report would run 200 times, or will it generate the report once and then burst the report based on parameters?
Adarsh
Thanks!! Can you specify how it would work in this case?
AM
Let's say you have a report (Report1) with 3 parameters (Name, ID, Region) and one data driven subscription on top of it with the following parameter values:
Name=abcde
ID=000
Region=xyz
The subscription will run the report with those parameter values.
I hope that helps.
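A minimal sketch of how this is usually driven: the delivery query of the data-driven subscription returns one row per report execution, and each column maps to a report parameter or delivery setting (the values and the e-mail column below are placeholders, not from the thread):

SELECT 'abcde'               AS Name,
       '000'                 AS ID,
       'xyz'                 AS Region,
       'someone@example.com' AS EmailTo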
Data Driven Subscription - Open Connections
Hi,
I'm trying to run a data driven subscription on a report in RS 2005,
using a data source that points to Oracle.
I found out that when the subscription runs, it opens a connection to Oracle for each report, which means I have more than fifteen open connections every time; this causes performance problems.
Did anyone encounter this kind of problem?
Please help.
Data driven subscriptions run the report for each row you return in your delivery query.
You can minimize the impact on the underlying RDBMS by changing the execution mode of the report to cache or snapshot. The first will ensure the minimal number of query executions occur on your underlying RDBMS. The second option will ensure only one query execution will occur (whenever the snapshot is updated). This has some benefits, but may require you to update your report to filter content for each user.
Hope that helps,
-Lukasz
Is this report going against transaction tables; is that why there is a performance problem? If for some reason you can't do what Lukasz suggests, maybe you can snapshot or cache the data yourself (whether in Oracle or an external mart somewhere)?
This way you massage the data once up front into the form your reports need, take the load off your transaction tables, even avoid any potential licensing issues on the Oracle side if that has been a concern with these connections.
There are probably a lot of ways to skin this particular cat <s>.
>L<
Wednesday, March 21, 2012
Data isn't transferred to one of my subscribers!
I use merge replication and the topology is central publication. I have 20 subscribers.
When I run the agent, I get the message "No data needed to be merged". The Publisher doesn't send data to this subscriber. When I update the data again, I can send data to the subscriber.
Why?!
How does replication work?
And how does it determine which records must be sent to the subscriber?
Publisher and subscribers run SQL 2000 with SP3 installed, on Windows Server 2003.
Please help me.
Thanks.
It's not clear what you're expecting or what the problem is.
To learn about replication, you should start by reading Replication topics in Books Online, then come back with any specific questions.
I guess you have set up merge replication using the "NoSync" option. See http://msdn2.microsoft.com/en-us/library/aa239398(SQL.80).aspx.
Did you see convergent data after the first sync?
Thanks.
This posting is provided AS IS with no warranties, and confers no rights.
Monday, March 19, 2012
Data Conversion Error
I am using SQL Server 2005 and Visual Web Developer 2005 Express (for right now). I can get the stored procedure to run fine if I do not return the CityID.
Stored Procedure
ALTER PROCEDURE [dbo].[WDR_CityAdd1]
(
    @CountryID int,
    @CityName nvarchar(50),
    @InternetAvail bit,
    @FlowersAvail bit,
    @CityID int OUTPUT
)
AS
IF EXISTS (SELECT 'True' FROM City WHERE CityName = @CityName AND CountryID = @CountryID)
BEGIN
    SELECT @CityID = 0
END
ELSE
BEGIN
    INSERT INTO City
    (
        CountryID,
        CityName,
        InternetAvail,
        FlowersAvail
    )
    VALUES
    (
        @CountryID,
        @CityName,
        @InternetAvail,
        @FlowersAvail
    )
    SELECT @CityID = 'CityID'
    -- (I have also tried = @@IDENTITY, but that never returned anything; CityID is an identity column (1,1))
END
Here is the code on the other end. I have not included all the parameters, but you should get a sense of what I am doing wrong.
Dim myCommand As New SqlCommand("WDR_CityAdd1", dbConn)
myCommand.CommandType = CommandType.StoredProcedure

Dim parameterCityName As New SqlParameter("@CityName", SqlDbType.NVarChar, 50)
parameterCityName.Value = CityName
myCommand.Parameters.Add(parameterCityName)

Dim parameterCityID As New SqlParameter("@CityID", SqlDbType.Int, 4)
parameterCityID.Direction = ParameterDirection.Output
myCommand.Parameters.Add(parameterCityID)

Try
    dbConn.Open()
    myCommand.ExecuteNonQuery()
    dbConn.Close()
    Return (parameterCityID.Value).ToString ' (have played with this using .ToString and not using it)
    'AlertMessage = "City Added"
Catch ex As Exception
    AlertMessage = ex.ToString
End Try
Here is the error I get. So what am I doing wrong? I figure it is maybe in the stored procedure. CityID is defined in the table as an int, so why is it telling me that it is a varchar when it is defined in the stored procedure, table and code as an int? Am I missing something?
System.Data.SqlClient.SqlException: Conversion failed when converting the varchar value 'CityID' to data type int. at System.Data.SqlClient.SqlConnection.OnError
Thanks
Jerry
There are a couple of issues:
grbourque wrote:
...
SELECT @CityID = 'CityID'
...
Return (parameterCityID.Value).ToString (have played with this using the .tostring and not using it)
...
--> Conversion failed when converting the varchar value 'CityID' to data type int.
In this location,
SELECT @CityID = 'CityID'
you are attempting to assign @CityID (an integer) the string value 'CityID'. That is what caused the error you reported.
I suggest using SCOPE_IDENTITY to capture the IDENTITY value of the last row entered by the current user/session:
SET @CityID = SCOPE_IDENTITY()
Also, the RETURN value 'should' be an integer, and you are attempting to return a string value.
('Normally', one would use the RETURN value for success/failure reporting.)
Somewhere prior to the RETURN statement, you 'should' assign the parameterCityID value to a previously declared variable, then use that variable in your application.
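Applied to the procedure above, the tail of the ELSE block would then look roughly like this (a sketch based on Arnie's suggestion, not the poster's final code):

    INSERT INTO City (CountryID, CityName, InternetAvail, FlowersAvail)
    VALUES (@CountryID, @CityName, @InternetAvail, @FlowersAvail)
    -- capture the identity value generated by the INSERT above, scoped to this session/batch
    SET @CityID = SCOPE_IDENTITY()
END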
Arnie,
I had tried playing around with the 'cityID' fieldname and was getting different error messages. I have re-written the code with your help and it works. Sometimes this just amazes me. I do appreciate the help. Maybe with time I will be able to help others as I get a handle on this.
Jerry
Friday, February 24, 2012
Damn Pesky 'RED X'
Running SQL 2000 SP3 & Win 2000 Adv Srv.
I cannot get rid of the Red X that appears in my replication monitor.
I have run 'sp_MSload_replication_status' to no avail.
HELP!!!
How can I get rid of this?
The X appears on the Replication
Monitor->Publishers->servername->Publication, but there is no problem
with the Snapshot, log Reader or the Distribution Agent.
The X appears on the Replication Monitor->Agents->Distribution Agents
folder, but there are no errors with any of the distribution agents.
My SQL Server services get restarted nightly, but the servers themselves are not
restarted. Manually restarting the SQL Server service does not fix this either.
Any help will be appreciated.
Larry...
Try restarting the sql server service if all else fails. Usually it results
from an out-of-date error in tempdb (assuming it is not a genuine error).
Rgds,
Paul Ibison
Have you enabled "show anonymous subscriptions"? Right-click on Replication
Monitor to do this.
Hilary Cotter
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com
Sunday, February 19, 2012
Daily scheduled job has its schedule disabled despite successful completion
I have other jobs configured in a similar way, but they are run and then rescheduled in the normal way without any problems.
What could be going on?
I'm sure you'd know, but is there definitely no disabling step in the job?
e.g.
EXEC sp_update_job @job_name = 'bla bla',
    @enabled = 0
This may be of some use to you. Have you checked the logs?
http://support.microsoft.com/default.aspx?scid=kb;en-us;295378&Product=sql2k
Daily Caching
I have set up a data-driven subscription to do a null report on a report
that takes 3 minutes to run.
I have it set up to run at 3 am and it works fine.
My question is when a user runs it at 6 am, does it use what I cached
plus the new data, or does it re-cache the whole thing?
This report is intense and I really do not want to run it during the day
if I don't have to.
If it constantly re-caches the report, would creating a snapshot on a
schedule be the better option?
Thanks.
It depends on the execution settings of the report. If the report is set to
cache the execution, the DD subscription runs with all parameter
combinations, and the cache expiration is sufficiently long, then
users will get cached renderings; they will not get any recent data.
Using a snapshot is another possibility; the only problem is that any
parameters that affect the query will be locked, and users will not be able
to change them.
--
-Daniel
This posting is provided "AS IS" with no warranties, and confers no rights.
"John Geddes" <john_g@.alamode.com> wrote in message
news:uXbnj4a7EHA.2124@.TK2MSFTNGP15.phx.gbl...
>I have set up a data-drive subscription to do a null report on a report
>that takes 3 minutes to run.
> I have it set up to run at 3 am and it works fine.
> My question is when a user runs it at 6 am, does it use what I cached plus
> the new data, or does it re-cache the whole thing?
> This report is intense and I really do not want to run it during the day
> if I don't have to.
> If it constantly re-caches the report, would creating a snapshot on a
> schedule be the better option?
> Thanks.
>
Friday, February 17, 2012
Crystal Report count
SELECT TYPE, PART_NUM, COUNT(PART_NUM) AS Expr1
FROM dbo.ITEM
GROUP BY PART_NUM, TYPE
type part_num count
Monitor gx260 20
Monitor gx620 40
You might need to use a distinct count in Crystal.
GJ
Cyrillic problem with cr10 on XP and 2003
Tuesday, February 14, 2012
Customizing the report scheduling...
One of the requirements is that if the report takes a long time to run, the user can start report processing and later,
when that report is processed, see it in his 'My Reports' section.
What will be the best way to do this? My main concerns are:
1. How can I use scheduling to accomplish this?
2. How should I save the report - as a snapshot or what?
3. How can I find out in my web app whether the report is ready or not, and show a link to it?
There isn't an easy way to know how long a report will take to run, if this is what you are after. Other than that:
1. Sure but scheduled reports have limitations. Since they run in an unattended mode, they cannot use User!UserID and all parameters must have defaults.
2. I would see if subscribed delivery works for your users. In fact, your users can create their own subscriptions if they have the necessary rights.
3. Subscribed reports can be delivered via e-mail. You don't need to do anything. The user will automatically receive the report via e-mail.
Hi Teo,
Thanks for the reply,
1. What if we are not using User!UserID in the report and all other parameters are passed while creating the schedule?
One more thing: while creating the schedule using the web service (CreateSchedule) we can send a link to the report in email (without attaching the report); what does that link point to, and can't we use that with the ReportViewer control to show the report to the user?
2. I think that after the schedule runs and finishes processing the report, Report Server would update some table to show whether the schedule has completed or not.
3. Just got an idea: can I save the report to an FTP location through the schedule and pick the report up from there to show to the user?
hope that makes sense....
1. The parameters need to be set when creating the subscription, not the schedule. A schedule can be shared among subscriptions. As I mentioned, if you decide to use subscribed delivery, it will be the end user who will set the parameters. When the user clicks on the link, the report will open in the ASP.NET ReportViewer control. In the case of a custom application, we address a similar requirement by having a custom job controller which would generate the report on the server as a snapshot and send the history ID to the application once the report is ready.
2. Again, there is a difference between a schedule and a subscription (if this is what you would use). That said, you can use the GetScheduleProperties API to get the last time the schedule was run.
3. A subscribed report can be delivered to a network share. Start creating a subscription and you will see that you have a choice between e-mail and network share delivery.
Teo Lachev wrote:
When the user clicks on the link, the report will open in the ASP.NET ReportViewer control.
Didn't get which link you are talking about... the one sent in the email? How come it will be opened in the ReportViewer?
Teo Lachev wrote:
In the case of a custom application, we address a similar requirement by having a custom job controller which would generate the report on the server as a snapshot and send the history ID to the application once the report is ready.
I think that's what I am looking for; can you provide some links for this or a sample app?
And by using the CreateSubscription API can I also set the schedule, or will I need something else for that?
Didn't get which link you are talking about... the one sent in the email? How come it will be opened in the ReportViewer?
When you set up an e-mail subscription you can configure it to send the report link instead of the entire report. The report will open up in the ASP.NET report viewer. Try it out http://localhost/reportserver?/AdventureWorks%20Sample%20Reports/Company%20Sales&rs:Command=Render
I think that's what I am looking for; can you provide some links for this or a sample app?
In our case, we don't use subscriptions. Instead, the job service generates the report as a snapshot
if (parameters != null) // set report parameters {
managementProxy.SetReportParameters(reportPath, parameters);
}
// Create the report snapshot so the user can browse the report
managementProxy.UpdateReportExecutionSnapshot(reportPath);
// Check if the report is configured to keep snapshots in history
bool keepExecutionShapshots = false;
bool result = managementProxy.GetReportHistoryOptions(reportPath, out keepExecutionShapshots, out schedule);
if (keepExecutionShapshots)
{
// history is automatically created, get the list of history runs
ReportHistorySnapshot[] history = managementProxy.ListReportHistory(reportPath);
Array.Sort(history, CompareReportHistoryByDate); // need to sorty by date since history runs may not be chronologically sorted
historyID = history[history.Length - 1].HistoryID; //grab the last history run
}
else
{
// explicitly create history snapshot
historyID = managementProxy.CreateReportHistorySnapshot(reportPath, out warnings);
}
Customizing the connection string to the server where sp will be created?
Niels