
Tuesday, March 27, 2012

data fields in header or footer

Hi all,
is there any workaround for using fields in a report header or footer?
Now I use parameters as a workaround and I'm not really happy with that...
Thanks,
Toni

You mean dataset fields in the header?
Use ReportItems for referring to fields in the header,
like this: =First(ReportItems("EmpName").Value)
Amarnath
"Toni Pohl" wrote:
> Hi all,
> is there any workaround for using fields in a report header or footer?
> Now I use parameters as a workaround and I'm not really happy with that...
> Thanks,
> Toni
>

Hi Amarnath,
well, this works! (and is a good workaround ;-)
Thanks!!!
Toni
"Amarnath" <Amarnath@.discussions.microsoft.com> schrieb im Newsbeitrag
news:BDAEC70F-7992-4277-89D9-6BCA627BC926@.microsoft.com...
> You mean dataset fields in the header?
> Use ReportItems for referring to fields in the header,
> like this: =First(ReportItems("EmpName").Value)
> Amarnath
> "Toni Pohl" wrote:
>> is there any workaround for using fields in a report header or footer?
>> Now I use parameters as a workaround and I'm not really happy with that...
>> Thanks,

Thursday, March 22, 2012

Data Driven Subscriptions Bug with XQuery?

Hello all,

I'm not sure if what I'm encountering is a bug, an intended feature, or user error...

I've got two tables set up to assist with some data-driven reports we want to run; both are listed below:

CREATE TABLE [dbo].[MS_REPORT_SUBSCRIPTION](
[ID] [int] NOT NULL,
[REPORT_NAME] [varchar](30) COLLATE SQL_Latin1_General_CP1_CS_AS NOT NULL,
[REPORT_DESCRIPTION] [varchar](60) COLLATE SQL_Latin1_General_CP1_CS_AS NOT NULL)

This table holds an instance of an intended data-driven subscription report.

CREATE TABLE [dbo].[MS_REPORT_PARAMETERS](
[ID] [int] NOT NULL,
[REPORT_ID] [int] NOT NULL,
[REPORT_PARAMETERS] [ntext] COLLATE SQL_Latin1_General_CP1_CS_AS NOT NULL)

This table will hold the report parameters. The REPORT_PARAMETERS column holds an XML string (formatted the same as in the Reporting Services database). This allows a single instance to have a variable number of parameters.
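
For illustration, a row in this table looks something like the following (the IDs and values here are made up; the XML shape is what the .value() calls below expect):

INSERT INTO dbo.MS_REPORT_PARAMETERS (ID, REPORT_ID, REPORT_PARAMETERS)
VALUES (1, 1, N'<ParameterValues>
<ParameterValue><Name>SupervisorID</Name><Value>42</Value></ParameterValue>
<ParameterValue><Name>FilePath</Name><Value>scorecards\</Value></ParameterValue>
<ParameterValue><Name>Department</Name><Value>IS</Value></ParameterValue>
</ParameterValues>')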

This is the query I use to get the information:

SELECT
convert(xml, MRP.REPORT_PARAMETERS).value('(/ParameterValues/ParameterValue/Value)[1]', 'int') SupervisorID,
'\\sxcorp1\temp\dmlenz\is_docs\' + convert(xml, MRP.REPORT_PARAMETERS).value('(/ParameterValues/ParameterValue/Value)[2]', 'varchar(255)') FilePath,
convert(xml, MRP.REPORT_PARAMETERS).value('(/ParameterValues/ParameterValue/Value)[3]', 'varchar(3)') Department
FROM DW_DIMENSION..MS_REPORT_SUBSCRIPTION MRS
JOIN DW_DIMENSION..MS_REPORT_PARAMETERS MRP ON MRS.ID = MRP.REPORT_ID
WHERE MRS.ID = 1

This works great when I run it within Management Studio. The problem happens when I try to use it for my data-driven query; I get the error below. I guess it doesn't know how to parse it? I'm not sure why this happens, since the database doesn't have a problem with it (or does it? see the exception below). This is the error I get in Report Manager:

The dataset cannot be generated. An error occurred while connecting to a data source, or the query is not valid for the data source. (rsCannotPrepareQuery) Get Online Help

For more information about this error navigate to the report server on the local server machine, or enable remote errors

I looked in the logs and found the exception below. Does anyone have an idea as to why this is happening? I do see the 'ARITHABORT' part below, but I'm not sure if this is something that just comes back with the exception or if it is truly set incorrectly. If the latter is the case, is this something that Management Studio sets correctly? Does anyone know what those settings are? I guess I'm trying to understand why Management Studio works but Report Manager doesn't...

Thanks in advance for any insight you all might have.

Regards,

Dan

End of inner exception stack trace
w3wp!library!a!06/08/2006-10:17:57:: i INFO: Call to GetSystemPermissions
w3wp!library!d!06/08/2006-10:18:04:: i INFO: Call to GetSystemPermissions
w3wp!library!d!06/08/2006-10:18:04:: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.CannotPrepareQueryException: The dataset cannot be generated. An error occurred while connecting to a data source, or the query is not valid for the data source., ;
Info: Microsoft.ReportingServices.Diagnostics.Utilities.CannotPrepareQueryException: The dataset cannot be generated. An error occurred while connecting to a data source, or the query is not valid for the data source. > System.Data.SqlClient.SqlException: SELECT failed because the following SET options have incorrect settings: 'ARITHABORT'. Verify that SET options are correct for use with indexed views and/or indexes on computed columns and/or query notifications and/or xml data type methods.
at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
at System.Data.SqlClient.SqlDataReader.ConsumeMetaData()
at System.Data.SqlClient.SqlDataReader.get_MetaData()
at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)
at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async)
at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result)
at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)
at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)
at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)
at System.Data.Common.DbCommand.System.Data.IDbCommand.ExecuteReader(CommandBehavior behavior)
at Microsoft.ReportingServices.DataExtensions.SqlCommandWrapperExtension.ExecuteReader(CommandBehavior behavior)
at Microsoft.ReportingServices.Library.SubscriptionManager.PrepareQuery(DataSource dataSource, DataSetDefinition dataSet, ReportParameter[]& parameters, Boolean& changed)
End of inner exception stack trace

This link has some info:

http://msdn2.microsoft.com/en-us/library/ms188783.aspx

I found this bullet point interesting:

The SET option settings must be the same as those required for indexed views and computed column indexes. Specifically, the option ARITHABORT must be set to ON when an XML index is created and when inserting, deleting, or updating values in the xml column. For more information, see SET Options That Affect Results.

I've attempted many things up to this point. For some reason, setting the options before the data-driven query still doesn't get rid of the error message above.

I have now gone as far as to create a table-valued function that will allow me to SELECT * FROM fn_function. This parses OK, but when I actually schedule and run the data-driven subscription, I get the following error in the subscription status: 'Error: Cannot read the next data row for the data set.' It looks like functions don't work quite right either.

Ryan: That article helped clear up a lot of information. Unfortunately I couldn't seem to get anything working yet. Thanks for the link.

If anyone else has any suggestions, I'd be willing to try anything at this point...

Regards,

Dan


Alright, this is another response to my own question. The good news is, I've figured out a way around the limitation.

First I tried this (using the set arithabort on option):

SET ARITHABORT on
SELECT convert(xml, MRP.REPORT_PARAMETERS).value('(/ParameterValues/ParameterValue/Value)[1]', 'int') SupervisorID,
'\\sxcorp1\temp\dmlenz\is_docs\' + convert(xml, MRP.REPORT_PARAMETERS).value('(/ParameterValues/ParameterValue/Value)[2]', 'varchar(255)') FilePath,
convert(xml, MRP.REPORT_PARAMETERS).value('(/ParameterValues/ParameterValue/Value)[3]', 'varchar(3)') Department
FROM DW_DIMENSION..MS_REPORT_SUBSCRIPTION MRS
JOIN DW_DIMENSION..MS_REPORT_PARAMETERS MRP ON MRS.ID = MRP.REPORT_ID
WHERE MRS.ID = 1

It didn't work. The next thing I tried was to create a proc that set the appropriate options, but the data-driven subscription engine didn't understand the proc, so that was no longer an option.

I stated above that I couldn't get functions to work; this isn't completely true. The reason they didn't work for me was that, in the log files, I was getting the same exception as in my original post. At this point I figured it was a lost cause; however, I was able to set the option before the function SQL. I'm not sure why this worked when the query above didn't.

The query that ultimately worked follows:

set ARITHABORT ON

SELECT SM.SupervisorID, '\\sxcorp1\temp\dmlenz\is_docs\zService Scorecards\' + SM.FilePath, SM.FName FileName, SM.Department FROM fn_DataDriven_IS_Service_Metrics(1) SM
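
For reference, the function is essentially the XML query from my first post wrapped in an inline table-valued function, something like this (a sketch; the real one also returns the FName column, which I've left out here):

CREATE FUNCTION fn_DataDriven_IS_Service_Metrics (@SubscriptionID int)
RETURNS TABLE
AS
RETURN
(
    SELECT
        convert(xml, MRP.REPORT_PARAMETERS).value('(/ParameterValues/ParameterValue/Value)[1]', 'int') SupervisorID,
        convert(xml, MRP.REPORT_PARAMETERS).value('(/ParameterValues/ParameterValue/Value)[2]', 'varchar(255)') FilePath,
        convert(xml, MRP.REPORT_PARAMETERS).value('(/ParameterValues/ParameterValue/Value)[3]', 'varchar(3)') Department
    FROM DW_DIMENSION..MS_REPORT_SUBSCRIPTION MRS
    JOIN DW_DIMENSION..MS_REPORT_PARAMETERS MRP ON MRS.ID = MRP.REPORT_ID
    WHERE MRS.ID = @SubscriptionID
)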

I don't understand why setting the option before the xml query didn't work, but I suppose, as long as I got it to work...

I hope this will eventually help someone else.

Regards,

Dan


Data Driven Subscriptions - Recipient List Management

Hi,
We have a need to be able to manage recipient lists for data-driven
subscriptions in a user-friendly way (i.e. we need users to be able to edit
these recipient lists, and they are not the type of users that we want to
allow to open Enterprise Manager!!).
The general plan is to develop some ASP.NET-type app that will edit the
table contents.
Before developing something new and possibly reinventing the wheel, I would
just like to know if the Reporting Services team is considering building
some type of recipient list management into future releases of RS?
Thanks!

Not at this time. You will need to write your own application to take care
of this.
--
-Daniel
This posting is provided "AS IS" with no warranties, and confers no rights.
"Gavin R" <GavinR@.discussions.microsoft.com> wrote in message
news:691AABAD-DE25-4B3E-9B68-9F3A9814AFD2@.microsoft.com...
> Hi,
> We have a need to be able to manage recipient lists for data driven
> subscriptions in a user friendly way. (i.e. we need users to be able to
> edit
> these recipient lists, and they are not the type of users that we want to
> allow to open enterprise manager !!)
> The general plan is to develop some ASP.Net type app that will edit the
> table contents.
> Before developing something new and possibly reinventing the wheel, I
> would
> just like to know if the Reporting Services team are considering building
> some type of recipient list management into future releases of RS?
> Thanks !

data driven subscription & User id

I have a report that makes use of the global user ID parameter. I was going
to set it up as a data-driven subscription, but as long as I have that
parameter in the report ... the option to schedule that report is not even
allowed. Why is that?

Because who is the user when it is scheduled? The userid is the person
requesting the report to run. This makes no sense when scheduling.
The user id is not who is viewing it (which makes sense because you can use
the userid as a query parameter so you have to know who the user is prior to
retrieving the data).
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"MJT" <MJT@.discussions.microsoft.com> wrote in message
news:57010572-B007-473A-8892-113CD05F2CEF@.microsoft.com...
>I have a report that makes use of the global user id parameter. I was
>going
> to set it up as a data driven subscription but as long as I have that
> parameter in the report ... the option to schedule that report is not even
> allowed. Why is that?

I know that makes sense ... I guess I should have completed my thought. This
report is needed to run both interactively and in a batch mode (which we are
using data driven subscriptions for). So the code that has the user_id parm
is mainly for interactive use. I was wondering if there might be a way to
avoid having to create 2 separate reports ... one for interactive and one for
batch. The interactive uses the user_id to force the initiation of the
report from a portal user interface (which puts an entry into a table when
initiated and contains user_id and what client user is running the report for
- this is handled by security code contained in the UI) That way if the url
is copied and pasted into a browser and someone else tries to run the report
... it will fail because there will not be an entry created by the UI to
match the user_id of the person running the report (unless it is the same
person who just ran it ;-) ) Anyway ... the desire to run this report in
a batch fashion came along later ... and it won't work for obvious reasons.
Is there an easy way to disregard the dataset that reads a table using the
user_id? It seems that just having that parameter in the report disagrees
with Reporting Services ... it doesn't care if there is logic to disregard it.
I'll have to give this some more thought ... but just wondering if you
have run into other situations where a report needed security for online ...
but wanted to disregard that for a batch run. Thanks,
"Bruce L-C [MVP]" wrote:
> Because who is the user when it is scheduled? The userid is the person
> requesting the report to run. This makes no sense when scheduling.
> The user id is not who is viewing it (which makes sense because you can use
> the userid as a query parameter so you have to know who the user is prior to
> retrieving the data).
>
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
>
> "MJT" <MJT@.discussions.microsoft.com> wrote in message
> news:57010572-B007-473A-8892-113CD05F2CEF@.microsoft.com...
> >I have a report that makes use of the global user id parameter. I was
> >going
> > to set it up as a data driven subscription but as long as I have that
> > parameter in the report ... the option to schedule that report is not even
> > allowed. Why is that?
>
>

First off, you say it is a report param? User!UserID is a global variable.
Are you assigning it to a report parameter? You can use it without first
assigning it to a report parameter. I think if you use it as a query
parameter you will need to run a test and find out what value it has when
run as a subscription then use an expression that substitutes whatever makes
sense.
As a test do this: Create a report with no dataset. Just a textbox set to an
expression and the value of the expression set to
=User!UserID.Value
Run the report both interactively and as a subscription and see what you
get. Then wherever you need it use an expression and an if statement.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"MJT" <MJT@.discussions.microsoft.com> wrote in message
news:2921F680-9C88-4BD2-8165-717CE1909428@.microsoft.com...
>I know that makes sense ... I guess I should have completed my thought.
>This
> report is needed to run both interactively and in a batch mode (which we
> are
> using data driven subscriptions for). So the code that has the user_id
> parm
> is mainly for interactive use. I was wondering if there might be a way to
> avoid having to create 2 separate reports ... one for interactive and one
> for
> batch. The interactive uses the user_id to force the initiation of the
> report from a portal user interface (which puts an entry into a table when
> initiated and contains user_id and what client user is running the report
> for
> - this is handled by security code contained in the UI) That way if the
> url
> is copied and pasted into a browser and someone else tries to run the
> report
> ... it will fail because there will not be an entry created by the UI to
> match the user_id of the person running the report (unless it is the same
> person who just ran it ;-) ) Anyway ... the desire to run this report
> in
> a batch fashion came along later ... and it won't work for obvious reasons.
> Is there an easy way to disregard the dataset that reads a table using the
> user_id? It seems that just having that parameter in the report disagrees
> with Reporting Services ... it doesn't care if there is logic to disregard it.
> I'll have to give this some more thought ... but just wondering if you
> have run into other situations where a report needed security for online
> ...
> but wanted to disregard that for a batch run. Thanks,
> "Bruce L-C [MVP]" wrote:
>> Because who is the user when it is scheduled? The userid is the person
>> requesting the report to run. This makes no sense when scheduling.
>> The user id is not who is viewing it (which makes sense because you can
>> use
>> the userid as a query parameter so you have to know who the user is prior
>> to
>> retrieving the data).
>>
>> --
>> Bruce Loehle-Conger
>> MVP SQL Server Reporting Services
>>
>> "MJT" <MJT@.discussions.microsoft.com> wrote in message
>> news:57010572-B007-473A-8892-113CD05F2CEF@.microsoft.com...
>> >I have a report that makes use of the global user id parameter. I was
>> >going
>> > to set it up as a data driven subscription but as long as I have that
>> > parameter in the report ... the option to schedule that report is not
>> > even
>> > allowed. Why is that?
>>

Wednesday, March 7, 2012

Database structure question

Hi,
I have a database which holds a record for each contact in Exchange.
The records contain a Status column.
I have been asked to allow the user to do a search on both the status
and some other properties of the contact (currently FirstName and LastName).
In order to do that I am going to cache the FirstName and
LastName (the searchable properties) of each contact using WebDAV
in my database.
Since the searchable properties can change in the future, I am thinking
of saving them in another table. Here is my structure:
Table Items
Id Status
1 Imported
Table KeyWords
ItemId Name Value
1 FirstName Julia
1 LastName Adriano
I would like to ask whether this structure will hurt performance, especially since
I am doing paging
using the http://rosca.net/writing/articles/serverside_paging.asp sample.
It is basically an ASP.NET intranet application with a single user.
BTW: I cannot use a distributed query, or store anything in Exchange.
Thanks in advance
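
To make it concrete, the kind of search this structure implies would look something like this (a sketch using the sample values above; one join per searchable property):

SELECT I.Id, I.Status
FROM Items I
JOIN KeyWords FN ON FN.ItemId = I.Id AND FN.Name = 'FirstName' AND FN.Value = 'Julia'
JOIN KeyWords LN ON LN.ItemId = I.Id AND LN.Name = 'LastName' AND LN.Value = 'Adriano'
WHERE I.Status = 'Imported'
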
Hi Julia,
This is not really an Access question, but more of an Exchange one, but
since I've done some work in both, I can perhaps get you pointed in the
right direction. Using the Outlook client for Exchange, there are 4 custom
fields which you can use to hold any data you wish. You can also use any
other field which is unlikely to be used. Just remember to document it well.
Exchange identifies every entry of any kind using a 128-character GUID
string which in Exchange is called the EntryID. You can query the MAPI
namespace to get that value. Now, having that value in your database will
automatically define the record as being imported (at least from Exchange).
The structure you have is OK, but it requires user input when it would be
just as easy to get a value from Exchange.
Ask how to implement this in one of the Exchange newsgroups.
Arvin Meyer, MCP, MVP
Microsoft Access
Free Access downloads:
http://www.datastrat.com
http://www.mvps.org/access
"Julia" <codewizard@.012.net.il> wrote in message
news:%23xGJipMdFHA.3076@.TK2MSFTNGP10.phx.gbl...
>
> Hi,
> I have a data base which hold a record for each contact in exchange
> the records contain Status column
> I have been asked to allow the user to do a search on both the status
> and some other properties of the contact(currently FirstName and LastName)
>
> in order to do that I am going to cache the FirstName and
> LastName(searchable properties) of each contact using WebDAV
> in my data base.
> since in the future the searchable properties can be changed I am thinking
> to save them in another table,here is my structure
> Table Items
> Id Status
> 1 Imported
> Table KeyWords
> ItemId Name Value
> 1 FirstName Julia
> 1 LastName Adriano
> I would like to ask whether this structure will hurt performance, especially
> since I am doing paging
> using the http://rosca.net/writing/articles/serverside_paging.asp sample.
> It is basically an ASP.NET intranet application with a single user.
> BTW: I cannot use a distributed query, or store anything in Exchange.
> Thanks in advance
>

Thanks, I know how to import it from Exchange; actually I am using WebDAV, not
MAPI.
I am also familiar with the EntryID (I am using the EntryID to identify the
contact) and custom fields, but as I wrote I CANNOT
change anything in the Exchange schema; actually, sometimes it is not Exchange
but rather another database.
I want to focus on the structure and performance, that's all.
Thanks,
"Arvin Meyer [MVP]" <a@.m.com> wrote in message
news:OmAlH$MdFHA.1288@.tk2msftngp13.phx.gbl...
> Hi Julia,
> This is not really an Access question, but more of an Exchange one, but
> since I've done some work in both, I can perhaps get you pointed in the
> right direction. Using the Outlook client for Exchange, there are 4 custom
> fields which you can use to hold any data you wish. You can also use any
> other field which is unlikely to be used. Just remember to document it
well.
> Exchange identifies every entry of any kind using a 128 character GUID
> string which in Exchange is called the EntryID. You can query the MAPI
> namespace to get that value. Now, having that value in your database will
> automatically define the record as being imported (at least from
> Exchange).
> The structure you have is OK, but it requires user input when it would be
> just as easy to get a value from Exchange.
> Ask how to implement this in one of the Exchange newsgroups.
> --
> Arvin Meyer, MCP, MVP
> Microsoft Access
> Free Access downloads:
> http://www.datastrat.com
> http://www.mvps.org/access
> "Julia" <codewizard@.012.net.il> wrote in message
> news:%23xGJipMdFHA.3076@.TK2MSFTNGP10.phx.gbl...

I see nothing wrong with your structure. If you're using ASP and the MSDE
engine with a client-side cursor, you may as well be using the JET engine. It
is designed as a client-side cursor database engine, and in that respect
alone its performance is often better than the SQL Server engine. The reason is
that Rushmore technology and JET are designed to send only the specific
indexes asked for, then return the data based on criteria used with the
index(es). Client-side cursors with the SQL engine return the entire
dataset, irrespective of indexes. If you can perform your work with a
server-side cursor, you will usually get better performance with the
SQL-Server engine.
Arvin Meyer, MCP, MVP
Microsoft Access
Free Access downloads:
http://www.datastrat.com
http://www.mvps.org/access
"Julia" <codewizard@.012.net.il> wrote in message
news:eAYEViNdFHA.2076@.TK2MSFTNGP15.phx.gbl...
> Thanks, I know how to import it from Exchange; actually I am using WebDAV, not
> MAPI.
> I am also familiar with the EntryID (I am using the EntryID to identify the
> contact) and custom fields, but as I wrote I CANNOT
> change anything in the Exchange schema; actually, sometimes it is not Exchange
> but rather another database.
>
> I want to focus on the structure and performance that's all
> Thanks,
> "Arvin Meyer [MVP]" <a@.m.com> wrote in message
> news:OmAlH$MdFHA.1288@.tk2msftngp13.phx.gbl...

Ok, thanks.
"Arvin Meyer [MVP]" <a@.m.com> wrote in message
news:e69$7mOdFHA.3156@.tk2msftngp13.phx.gbl...
> I see nothing wrong with your structure. If you're using ASP and the MSDE
> engine with a client-side cursor, you may as well be using the JET engine. It
> is designed as a client-side cursor database engine, and in that respect
> alone its performance is often better than the SQL Server engine. The reason is
> that Rushmore technology and JET are designed to send only the specific
> indexes asked for, then return the data based on criteria used with the
> index(es). Client-side cursors with the SQL engine return the entire
> dataset, irrespective of indexes. If you can perform your work with a
> server-side cursor, you will usually get better performance with the
> SQL-Server engine.
> --
> Arvin Meyer, MCP, MVP
> Microsoft Access
> Free Access downloads:
> http://www.datastrat.com
> http://www.mvps.org/access
> "Julia" <codewizard@.012.net.il> wrote in message
> news:eAYEViNdFHA.2076@.TK2MSFTNGP15.phx.gbl...

database in suspect mode

Hi,
In my SQL 7 instance, msdb and one user database are both in suspect mode. This has
been the state for several hours, so I can assume it will not change.
I suspect the user database is causing msdb to go into suspect mode.
However, I am looking for information about the option to force the
bit causing the suspect mode, which will then change the mode.
Thanks,
Yanco

Yaniv, shalom
If you have backups of these databases I'd recommend you restore them.
You can also reset the status of a suspect database with:
UPDATE master..sysdatabases SET status = status ^ 256
WHERE name = 'Database_Name'
Note: updating system tables directly is strongly discouraged.
"yaniv" <yanive@.nice.com> wrote in message
news:e7DOr#fnDHA.1708@.TK2MSFTNGP12.phx.gbl...
> Hi,
> In my sql7 the msdb and one user database are both in suspect mode. This
> is
> the state for several hours so I can assume it will not change.
> I suspect the user database is causing msdb to get in suspect mode.
> However, I am looking for the information regarding the option to force
the
> bit causing the suspect mode which will then change the mode.
>
> Thanks,
> Yanco
>

Thanks,
I did not have a recent backup of the user database.
I used sp_configure to allow updates to system tables and then sp_resetstatus,
and it did the job for both databases; the next time SQL Server started, the
status was normal.
I will later look into the log files to try to understand the cause of the
problem.
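
For anyone else who hits this, the sequence was roughly as follows (SQL Server 7.0, reconstructed from memory; substitute your own database names):

EXEC sp_configure 'allow updates', 1
RECONFIGURE WITH OVERRIDE
EXEC sp_resetstatus 'msdb'
EXEC sp_resetstatus 'user_database_name'
EXEC sp_configure 'allow updates', 0
RECONFIGURE WITH OVERRIDE
-- then stop and restart SQL Server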
--
"Uri Dimant" <urid@.iscar.co.il> wrote in message
news:uUQKyCgnDHA.708@.TK2MSFTNGP10.phx.gbl...
> Yaniv,shalom
> If you have backup of these databases I'd recommend you to restore it.
> Also you can reset the status of suspect database by
> UPDATE master..sysdatabases SET status = status ^ 256
> WHERE name = 'Database_Name'
> Note: It's strongly not recommended to update system tables.
>
> "yaniv" <yanive@.nice.com> wrote in message
> news:e7DOr#fnDHA.1708@.TK2MSFTNGP12.phx.gbl...
> > Hi,
> >
> > In my sql7 the msdb and one user database are both in suspect mode. This
> is
> > the state for several hours so I can assume it will not change.
> >
> > I suspect the user database is causing msdb to get in suspect mode.
> >
> > However, I am looking for the information regarding the option to force
> the
> > bit causing the suspect mode which will then change the mode.
> >
> >
> >
> > Thanks,
> > Yanco
> >
> >
>

Data Available in a Trigger

I am creating a transaction log trigger for a table.

I would like to log the following data:

The user login id: SYSTEM_USER

The User's Computer name: ?

The servers time and date: GetDate()

The trigger Action: Update, Delete, or Insert

The unique record ID for the affected table:

One column for the deleted row in xml raw:

One column for the inserted row in xml raw:

Is there a way to get the user's computer name?

Consider this:

"select * from inserts for xml raw"

This is nice because I want to store the inserted and/or deleted table information as XML columns, but I would like to handle the transaction log in a way that creates one transaction log row per row in the inserted or deleted table.

My end goal is to insert into the transaction log table as follows:

Let's say that my update trigger contains an inserted table with two rows; the deleted table would then also have two rows.

I would like to insert two rows into my transaction log, with the username, date/time, and action, plus one column for the inserted row as XML raw and one column for the deleted row as XML raw.

Anyone know how to do this?

It would be nice if I could implement something like this:

'select * as xml raw from inserted', and it would return as many rows as are in the inserted table, each row having one column which represents that row in XML format.

'select * from inserted for xml raw' is not good because it returns one row with one column whose value is an XML representation of the entire inserted table.

Thanx

Jerry Cicierega

The HOST_NAME() function will return this information, but it is not always set. Try looking this up in Books Online.

Thank you for responding. Yes this works on my system. The computer name part of my issue is solved.

The xml part of my question still stands.

Thank you !
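
For the per-row XML part, one approach (an untested sketch, assuming SQL Server 2005 or later; the log table, audited table, and key column ID are all illustrative) is a correlated subquery with the TYPE directive, so each audited row carries its own XML fragment:

CREATE TRIGGER trg_MyTable_Audit ON dbo.MyTable
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    -- one log row per affected row
    INSERT INTO dbo.TransactionLog
        (LoginName, HostName, LogDate, RecordID, InsertedXml, DeletedXml)
    SELECT
        SYSTEM_USER,  -- user login id
        HOST_NAME(),  -- user's computer name (when the client sets it)
        GETDATE(),    -- server time and date
        COALESCE(i.ID, d.ID),
        (SELECT i2.* FROM inserted i2 WHERE i2.ID = i.ID FOR XML RAW, TYPE),
        (SELECT d2.* FROM deleted d2 WHERE d2.ID = d.ID FOR XML RAW, TYPE)
    FROM inserted i
    FULL OUTER JOIN deleted d ON i.ID = d.ID
END

The action can then be derived from which XML column is NULL: an insert has a NULL DeletedXml, a delete has a NULL InsertedXml, and an update has both populated.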

Saturday, February 25, 2012

Data and log file naming conventions

I have several user databases across several servers.
Some of the physical data files are stored as databasename.mdf,
databasename_data.mdf...
Some of the physical log files are stored as databasename.ldf,
databasename_log.ldf...
Does any one has any naming convention suggestion on this?
Currently SQL2K. Planning to move SQL2005 when it is available.
Thanks,
Harry

Use whatever convention is logical and meets your needs; the main thing
is to be consistent.
Personally, I tend to stick with the GUI default <dbname>_data.mdf &
<dbname>_log.ldf for everything I can (using default values for things,
if they're sensible & reasonable, tends to avoid the pain of having to
memorise different conventions). For secondary data files I tend to use
<dbname>n_data.ndf where n is just a simple incrementing integer (1, 2,
3...), e.g. mydb_Data.mdf, mydb_Log.ldf, mydb1_Data.ndf, mydb2_Data.ndf, etc.
CREATE DATABASE <dbname>, with no extra parameters, will create the
database with <dbname>.mdf & <dbname>_log.ldf but I've always thought
that to be inconsistent (with "_log" but not "_data") so I've always
gone with the recommendation of the GUI (with the _data & _log postfixes).
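
For example, following that convention (a sketch; the file names and paths are illustrative):

CREATE DATABASE mydb
ON PRIMARY
    (NAME = mydb_Data, FILENAME = 'D:\Data\mydb_Data.mdf'),
    (NAME = mydb1_Data, FILENAME = 'D:\Data\mydb1_Data.ndf')
LOG ON
    (NAME = mydb_Log, FILENAME = 'E:\Logs\mydb_Log.ldf')
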
*mike hodgson*
blog: http://sqlnerd.blogspot.com
HarrySmith wrote:

>I have several user databases across several servers.
>Some of the physical data files are stored as databasename.mdf,
>databasename_data.mdf...
>Some of the physical log files are stored as databasename.ldf,
>databasename_log.ldf...
>Does any one has any naming convention suggestion on this?
>Currently SQL2K. Planning to move SQL2005 when it is available.
>Thanks,
>Harry
>
>


Tuesday, February 14, 2012

Customizing the report scheduling...

One of the requirements is that, if a report takes a long time to run, the user can start report processing and later,
when the report has been processed, see it in his 'My Reports' section.

What is the best way to do this? My main concerns are:

1. How can I use scheduling to accomplish this?

2. How should I save the report: as a snapshot, or something else?

3. How can my web app find out whether the report is ready, and show a link to it?

There isn't an easy way to know how long a report will take to run, if this is what you're after. Other than that:

1. Sure, but scheduled reports have limitations. Since they run in unattended mode, they cannot use User!UserID, and all parameters must have defaults.

2. I would see if subscribed delivery works for your users. In fact, your users can create their own subscriptions if they have the necessary rights.

3. Subscribed reports can be delivered via e-mail. You don't need to do anything. The user will automatically receive the report via e-mail.


Hi Teo,

Thanks for the reply,

1. What if we are not using User!UserID in the report and all the other parameters are passed while creating the schedule?

One more thing: when creating the schedule using the web service (CreateSchedule), we can send a link to the report in the email (without attaching the report). What does that link point to, and can't we use it with the ReportViewer control to show the report to the user?

2. I think that after the schedule runs and finishes processing the report, Reporting Services updates some table to show whether the schedule completed or not.

3. I just got an idea: can I save the report to an FTP location through the schedule and pick it up from there to show to the user?

Hope that makes sense...


1. The parameters need to be set when creating the subscription, not the schedule. A schedule can be shared among subscriptions. As I mentioned, if you decide to use subscribed delivery, it will be the end user who sets the parameters. When the user clicks on the link, the report will be opened in the ASP.NET ReportViewer control. In the case of a custom application, we addressed a similar requirement by having a custom job controller which generates the report on the server as a snapshot and sends the history ID to the application once the report is ready.

2. Again, there is a difference between a schedule and a subscription (if this is what you would use). That said, you can use the GetScheduleProperties API to get the last time the schedule ran.

3. A subscribed report can be delivered to a network share. Start creating a subscription and you will see that you have a choice between e-mail and network share delivery.


Teo Lachev wrote:

When the user clicks on the link, the report will be opened in the ASP.NET ReportViewer control.

Didn't get which link you are talking about... the one sent in the email? How come it will be opened in the ReportViewer?

Teo Lachev wrote:

In the case of a custom application, we addressed a similar requirement by having a custom job controller which generates the report on the server as a snapshot and sends the history ID to the application once the report is ready.

I think that's what I am looking for; can you provide some links for this or a sample app?

And by using the CreateSubscription API can I also set the schedule, or will I need something else for that?


Didn't get which link you are talking about... the one sent in the email? How come it will be opened in the ReportViewer?

When you set up an e-mail subscription you can configure it to send the report link instead of the entire report. The report will open up in the ASP.NET report viewer. Try it out: http://localhost/reportserver?/AdventureWorks%20Sample%20Reports/Company%20Sales&rs:Command=Render

I think that's what I am looking for; can you provide some links for this or a sample app?

In our case, we don't use subscriptions. Instead, the job service generates the report as a snapshot:

if (parameters != null)
{
    // set report parameters
    managementProxy.SetReportParameters(reportPath, parameters);
}

// Create the report snapshot so the user can browse the report
managementProxy.UpdateReportExecutionSnapshot(reportPath);

// Check if the report is configured to keep snapshots in history
bool keepExecutionSnapshots = false;
bool result = managementProxy.GetReportHistoryOptions(reportPath, out keepExecutionSnapshots, out schedule);

if (keepExecutionSnapshots)
{
    // history is automatically created, get the list of history runs
    ReportHistorySnapshot[] history = managementProxy.ListReportHistory(reportPath);

    // need to sort by date since history runs may not be chronologically sorted
    Array.Sort(history, CompareReportHistoryByDate);

    // grab the last history run
    historyID = history[history.Length - 1].HistoryID;
}
else
{
    // explicitly create history snapshot
    historyID = managementProxy.CreateReportHistorySnapshot(reportPath, out warnings);
}

Customizing Security

Hello,
I was wondering if it's possible to have database users authenticate
against an external database or directory server? The client has a
centralized repository system that they would like all systems to
authenticate against. If this is feasible, can you offer any suggestions
on how we can achieve this?
Thanks in advance
Tom

Hi Tom,
Are you asking about central authentication using SQL Security or
Windows Authentication?
Thanks,
Kevin McDonnell
Microsoft Corporation
This posting is provided AS IS with no warranties, and confers no rights.

I have a similar situation and need to plan for centralized security around
Windows/AD. Furthermore, I need to leverage the AD groups/roles in the
architecture. Any insight or direction would be appreciated.
"Kevin McDonnell [MSFT]" wrote:

> Hi Tom,
> Are you asking about central authentication using SQL Security or
> Windows Authentication?
> Thanks,
> Kevin McDonnell
> Microsoft Corporation
> This posting is provided AS IS with no warranties, and confers no rights.
>
>|||Previous post:
I have a similar situation and need to plan for centralized security around
Windows/AD. Furthermore, I need to leverage the AD groups/roles in the
architecture. Any insight or direction would be appreciated.
Reply:
We use domain-based groups or local groups to base our security on. We can
publish objects to
Active Directory, but our security is not tied to AD the way Exchange 2000's is, for
example. We can authenticate via
trusted connections using NTLM or Kerberos. Kerberos-based authentication
requires that the Service Principal Name is
set for SQL Server. See Books Online for the example.
Our Security Model is discussed here:
http://www.microsoft.com/technet/pr...n/sp3sec01.mspx
Thanks,
Kevin McDonnell
Microsoft Corporation
This posting is provided AS IS with no warranties, and confers no rights.

customizing reporting services error

Is there any way to customize the Reporting Services error page and provide a user-friendly message? We use Reporting Services 2000, and the error message is:

Reporting Services Error


An error has occurred during report processing. (rsProcessingAborted) Get Online Help

Query execution failed for data set 'dsasSource'. (rsErrorExecutingCommand) Get Online Help

No.

Jens K. Suessmeyer.

http://www.sqlserver2005.de