Saturday, February 19, 2011

Updated version for Schedule Maintenance Mode Management Pack Guide on OpsManJam

This management pack provides an automated method to schedule maintenance mode for a given set of Windows computers that are in a pre-defined set of groups, together with the associated rules targeted against those groups. In addition, this version supports scheduling maintenance mode for Windows Cluster services. It serves as an example and a foundation to build on when developing your own custom maintenance mode schedules in Operations Manager 2007 R2.


Download OpsManJam

Useful Link:
http://www.opsmanjam.com/OpsManJam%20Library/Forms/MP%20folder%20view.aspx?RootFolder=/OpsManJam%20Library/Management%20Packs&FolderCTID=&View={28F6035C-64EB-43DD-AE64-039B5C85A626}

User Logon Time Report for Operations Manager 2007


A customer was asking for some information about the time it took for users to log on to their Terminal Server hosted Windows desktops. Luckily there are two event IDs created during the logon process, and if you calculate the time between those event IDs you get some insight into the logon time for a user.
When a user logs on, Security Event 528 is created. Event 528 is logged whenever an account logs on to the local computer, except in the event of network logons (see event 540). Event 528 is logged whether the account used for logon is a local SAM account or a domain account. On a Windows 7 client, Security event 4624 is created instead.
Example Screenshot:
For the last logon event they used Event ID 865 (Software Restriction Policy Notification), so the logon time could be calculated as the time difference between those two event IDs. You should look for your own event IDs if you want to create a similar report.
So how can these Events be used to create a Logon Time Report?
High Level steps:
  1. Create two NT Event Collection Rules for each EventID (528 and 865) in OpsMgr.
  2. Create an SQL Query which calculates the Time difference between the two EventID’s collected by the Event Collection Rules for each user.
  3. Create a Custom Logon Time OpsMgr Report with Visual Studio.
Let’s start with a more detailed overview of the steps.
For testing purposes I created a batch script, which you can use to create two dummy eventID’s in the Application Event Log. You can run this script using different credentials (users) with the SysInternals tool ShellRunAs.
@echo off
REM First EventID 528 LastEventID 865
Eventcreate /L Application /D "First EventID created" /T INFORMATION /ID 528
REM Create Random number between 1 -30 secs.
Set max=30
Set min=1
:: Generate number between Min and Max
Set /A rand=%random% %% (max - min + 1)+ min
REM Wait for 1-30 secs to generate second EventID
sleep %rand%
Eventcreate /L Application /D "Last EventID created" /T INFORMATION /ID 865
The above script uses the sleep.exe command, so you need to install it from the Windows Resource Kit if you haven’t already. As you can see, the time between the two events is a random number between 1 and 30 seconds.
Save the above code as LogOndemo.cmd and run it from a command prompt.




Create two NT Event Collection Rules for each EventID (528 and 865) in OpsMgr
OK, now that the two dummy events are created we can create the Event Collection Rules. Make sure you have created a new MP to store your Event Collection Rules.


Disable the rule for now; we will enable the rules later, with an override, for the servers we want the collection rules to run on.


Keep in mind these example collection rules are created for the dummy event IDs; you should select your own event IDs.
Enable the rules, via an override, for the server(s) you need the collection rule running on.
For EventID 865 follow the same steps as shown above. Don’t forget to enable the rule with an override!

Create an SQL Query which calculates the Time difference between the two EventID’s collected by the Event Collection Rules for each user.
And now the most difficult part of creating a custom report: creating the right database SQL queries. I got some help from some SQL gurus at the office, so you may want to talk to your local DBA for some pointers on creating the right SQL queries.
First we need to determine what we want to show in our Reports. Right? The customer wanted to see the Minimum, Maximum and Average LogOn time per user on a server. Something like this:
But first, some steps I took to create the SQL queries. First I wanted to collect all the 528 and 865 events collected by the two collection rules (FirstLogOn Event 528 and LastLogOn Event 865). I used the following SQL query from Anders Bengtsson for that and changed it a little. It collects all the collected 528 and 865 events within the selected period ('2009-12-01 00:00:00.000' to '2009-12-12 00:00:00.000').
DECLARE @StartDate datetime
DECLARE @EndDate   datetime
SET @StartDate = '2009-12-01 00:00:00.000'
SET @EndDate = '2009-12-12 00:00:00.000';
USE OPERATIONSMANAGERDW
SELECT
    vEvent.DateTime,
    vEventPublisher.EventPublisherName as 'EventSource',
    vEventLoggingComputer.ComputerName as 'Computer',
    vEventLevel.EventLevelTitle as 'Type',
    vEvent.EventDisplayNumber as 'EventID',
    vEventChannel.EventChannelTitle,
    vEventUserName.UserName,
    vEventDetail.RenderedDescription as 'EventDescription'
FROM Event.vEvent LEFT OUTER JOIN
    vEventUserName ON vEvent.UserNameRowId =
    vEventUserName.EventUserNameRowId LEFT OUTER JOIN
    vEventCategory ON vEvent.EventCategoryRowId =
    vEventCategory.EventCategoryRowId LEFT OUTER JOIN
    vEventPublisher ON vEvent.EventPublisherRowId =
    vEventPublisher.EventPublisherRowId LEFT OUTER JOIN
    vEventLoggingComputer ON vEvent.LoggingComputerRowId =
    vEventLoggingComputer.EventLoggingComputerRowId LEFT OUTER JOIN
    vEventLevel ON vEvent.EventLevelId = vEventLevel.EventLevelId LEFT OUTER JOIN
    vEventChannel ON vEvent.EventChannelRowId =
    vEventChannel.EventChannelRowId LEFT OUTER JOIN
    Event.vEventDetail ON vEvent.EventOriginId = vEventDetail.EventOriginId
WHERE vEventLevel.EventLevelTitle = 'Information'
AND vEvent.Datetime between @StartDate and @EndDate
AND (vEvent.EventDisplayNumber = 528
OR vEvent.EventDisplayNumber = 865)   
Result Screenshot:



It’s a start, but not completely what we want ;-)
As you can see, the logon time for user OpsMgrDemo\OM_Admin is 22 seconds.
So which query do we need to calculate the time difference between those two events per UserName?
DECLARE @StartDate datetime
DECLARE @EndDate   datetime
SET @StartDate = '2009-12-01 00:00:00.000'
SET @EndDate = '2009-12-12 00:00:00.000';
WITH CTE AS
    (SELECT
    ROW_NUMBER() OVER(ORDER BY vEvent.DateTime) AS RowNum,
    vEvent.DateTime,
    vEventPublisher.EventPublisherName as 'EventSource',
    vEventLoggingComputer.ComputerName as 'Computer',
    vEventLevel.EventLevelTitle as 'Type',
    vEvent.EventDisplayNumber as 'EventID',
    vEventChannel.EventChannelTitle,
    vEventUserName.UserName,
    vEventDetail.RenderedDescription as 'EventDescription'
    FROM
    Event.vEvent LEFT OUTER JOIN
    vEventUserName ON vEvent.UserNameRowId =
    vEventUserName.EventUserNameRowId LEFT OUTER JOIN
    vEventCategory ON vEvent.EventCategoryRowId =
    vEventCategory.EventCategoryRowId LEFT OUTER JOIN
    vEventPublisher ON vEvent.EventPublisherRowId =
    vEventPublisher.EventPublisherRowId LEFT OUTER JOIN
    vEventLoggingComputer ON vEvent.LoggingComputerRowId =
    vEventLoggingComputer.EventLoggingComputerRowId LEFT OUTER JOIN
    vEventLevel ON vEvent.EventLevelId = vEventLevel.EventLevelId LEFT OUTER JOIN
    vEventChannel ON vEvent.EventChannelRowId =
    vEventChannel.EventChannelRowId LEFT OUTER JOIN
    Event.vEventDetail ON vEvent.EventOriginId = vEventDetail.EventOriginId
    WHERE vEventLevel.EventLevelTitle = 'Information'
    AND vEvent.Datetime between @StartDate and @EndDate
    AND (vEvent.EventDisplayNumber = 528
    OR vEvent.EventDisplayNumber = 865)     
)
SELECT *
, (SELECT T2.DateTime
    FROM CTE AS T2
    WHERE (CTE.RowNum + 1)= T2.RowNum) AS LogOffTime
, DATEDIFF(s, CTE.DateTime, (SELECT T2.DateTime
    FROM CTE AS T2
    WHERE (CTE.RowNum + 1)= T2.RowNum) ) AS LogTime
FROM CTE
WHERE RowNum%2 = 1
Result Screenshot:
As you can see, the logon time is now calculated per UserName. Again you see that the logon took 22 seconds for user OM_Admin. (The pairing works because the rows are ordered by DateTime, so each 528 event on an odd row is matched with the 865 event on the row directly after it.)
You could already use this SQL query as the database query for your custom OpsMgr report in Visual Studio. You could, for instance, group the data on Computer or UserName to generate a report on logon time for a specific computer or user.
Example Screenshot:
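A variant worth considering (not part of the original approach): the ROW_NUMBER above runs over all collected events ordered only by DateTime, so it assumes a 528 event is always directly followed by its matching 865 event. If logons from different users or servers overlap, you could partition the numbering per computer and user so the pairs stay together. A minimal sketch of that variant, trimmed to the essential columns (the extra display columns and joins from the query above can be added back in the same way):
DECLARE @StartDate datetime
DECLARE @EndDate   datetime
SET @StartDate = '2009-12-01 00:00:00.000'
SET @EndDate = '2009-12-12 00:00:00.000';
WITH CTE AS
    (SELECT
    -- Number the events per computer and user, so interleaved logons cannot pair up incorrectly
    ROW_NUMBER() OVER(PARTITION BY vEventLoggingComputer.ComputerName, vEventUserName.UserName
                      ORDER BY vEvent.DateTime) AS RowNum,
    vEvent.DateTime,
    vEventLoggingComputer.ComputerName as 'Computer',
    vEventUserName.UserName,
    vEvent.EventDisplayNumber as 'EventID'
    FROM Event.vEvent LEFT OUTER JOIN
    vEventUserName ON vEvent.UserNameRowId = vEventUserName.EventUserNameRowId LEFT OUTER JOIN
    vEventLoggingComputer ON vEvent.LoggingComputerRowId = vEventLoggingComputer.EventLoggingComputerRowId LEFT OUTER JOIN
    vEventLevel ON vEvent.EventLevelId = vEventLevel.EventLevelId
    WHERE vEventLevel.EventLevelTitle = 'Information'
    AND vEvent.Datetime between @StartDate and @EndDate
    AND (vEvent.EventDisplayNumber = 528 OR vEvent.EventDisplayNumber = 865)
)
SELECT *
-- Match each 528 row (odd RowNum within its computer/user partition) with the next row for the same computer and user
, (SELECT T2.DateTime
    FROM CTE AS T2
    WHERE T2.Computer = CTE.Computer AND T2.UserName = CTE.UserName
    AND T2.RowNum = CTE.RowNum + 1) AS LogOffTime
, DATEDIFF(s, CTE.DateTime, (SELECT T2.DateTime
    FROM CTE AS T2
    WHERE T2.Computer = CTE.Computer AND T2.UserName = CTE.UserName
    AND T2.RowNum = CTE.RowNum + 1)) AS LogTime
FROM CTE
WHERE RowNum % 2 = 1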
But we wanted to create a report with the MIN, MAX and AVG logon time per user, right?
For that we need the following SQL query.
DECLARE @StartDate datetime
DECLARE @EndDate   datetime
SET @StartDate = '2009-12-01 00:00:00.000'
SET @EndDate = '2009-12-12 00:00:00.000';
-- Create TEMP table
--(RowNum, DateTime, EventSource, Computer,Type,EventID, EventChannelTitle, UserName, EventDescription, LoggOffTime, LogTime)
DECLARE @temp TABLE(
    RowNum INTEGER,
    DateTime DATETIME,
    EventSource NVARCHAR(250),
    Computer NVARCHAR(250),
    Type NVARCHAR(250),
    EventID INTEGER,
    EventChannelTitle NVARCHAR(250),
    UserName NVARCHAR(250),
    EventDescription NVARCHAR(250),
    LogOffTime DATETIME,
    LogTime INTEGER );
WITH CTE AS
    (SELECT 
    ROW_NUMBER() OVER(ORDER BY vEvent.DateTime) AS RowNum,
    vEvent.DateTime,
    vEventPublisher.EventPublisherName as 'EventSource',
    vEventLoggingComputer.ComputerName as 'Computer',
    vEventLevel.EventLevelTitle as 'Type',
    vEvent.EventDisplayNumber as 'EventID',
    vEventChannel.EventChannelTitle,
    vEventUserName.UserName,
    vEventDetail.RenderedDescription as 'EventDescription'
    FROM
    Event.vEvent LEFT OUTER JOIN
    vEventUserName ON vEvent.UserNameRowId =
    vEventUserName.EventUserNameRowId LEFT OUTER JOIN
    vEventCategory ON vEvent.EventCategoryRowId =
    vEventCategory.EventCategoryRowId LEFT OUTER JOIN
    vEventPublisher ON vEvent.EventPublisherRowId =
    vEventPublisher.EventPublisherRowId LEFT OUTER JOIN
    vEventLoggingComputer ON vEvent.LoggingComputerRowId =
    vEventLoggingComputer.EventLoggingComputerRowId LEFT OUTER JOIN
    vEventLevel ON vEvent.EventLevelId = vEventLevel.EventLevelId LEFT OUTER JOIN
    vEventChannel ON vEvent.EventChannelRowId =
    vEventChannel.EventChannelRowId LEFT OUTER JOIN
    Event.vEventDetail ON vEvent.EventOriginId = vEventDetail.EventOriginId
    WHERE vEventLevel.EventLevelTitle = 'Information'
    AND vEvent.Datetime between @StartDate and @EndDate
    AND (vEvent.EventDisplayNumber = 528
    OR vEvent.EventDisplayNumber = 865)
)
INSERT INTO @temp(RowNum, DateTime, EventSource, Computer,Type,EventID, EventChannelTitle, UserName, EventDescription, LogOffTime, LogTime)
SELECT *
, (SELECT T2.DateTime
    FROM CTE AS T2
    WHERE (CTE.RowNum + 1)= T2.RowNum) AS LogOffTime
, DATEDIFF(s, CTE.DateTime, (SELECT T2.DateTime
    FROM CTE AS T2
    WHERE (CTE.RowNum + 1)= T2.RowNum) ) AS LogTime
FROM CTE
WHERE RowNum%2 = 1
SELECT
    Computer,
    UserName,
    COUNT(t1.LogTime) AS [NUMBEROFLOGINS],
    MAX(t1.LogTime) AS [MAXLOGTIME],
    MIN(t1.LogTime) AS [MINLOGTIME],
    AVG(t1.LogTime) AS [AVGLOGTIME]
    FROM @temp t1
    GROUP BY Computer,UserName
Screenshot Result:
Yes! This is what we were looking for. Now that we have the right SQL query, we can open Visual Studio to create our custom User LogOn Time report.
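One small T-SQL detail worth keeping in mind: because LogTime is declared as INTEGER in the table variable, AVG(t1.LogTime) returns a truncated integer average. If you prefer fractional seconds, cast the column first, for example:
    AVG(CAST(t1.LogTime AS decimal(9,2))) AS [AVGLOGTIME]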
Create a Custom Logon Time OpsMgr Report with Visual Studio
Let’s open SQL Server Business Intelligence Development Studio.
Create a New Project and select Report Server Project Wizard and give your project a name.
Select Next in the Welcome to the Report Wizard Screen
Create a Data Source

Click Edit button to enter the SQL Server information.

And Test Connection if you want.
Click Next in the Report Wizard Screen.
Enter the previously created SQL query with some changes: remove the DECLARE and SET statements for @StartDate and @EndDate (the Report Wizard will turn these into report parameters).
DECLARE @temp TABLE(
    RowNum INTEGER,
    DateTime DATETIME,
    EventSource NVARCHAR(250),
    Computer NVARCHAR(250),
    Type NVARCHAR(250),
    EventID INTEGER,
    EventChannelTitle NVARCHAR(250),
    UserName NVARCHAR(250),
    EventDescription NVARCHAR(250),
    LogOffTime DATETIME,
    LogTime INTEGER );
WITH CTE AS
    (SELECT 
    ROW_NUMBER() OVER(ORDER BY vEvent.DateTime) AS RowNum,
    vEvent.DateTime,
    vEventPublisher.EventPublisherName as 'EventSource',
    vEventLoggingComputer.ComputerName as 'Computer',
    vEventLevel.EventLevelTitle as 'Type',
    vEvent.EventDisplayNumber as 'EventID',
    vEventChannel.EventChannelTitle,
    vEventUserName.UserName,
    vEventDetail.RenderedDescription as 'EventDescription'
    FROM
    Event.vEvent LEFT OUTER JOIN
    vEventUserName ON vEvent.UserNameRowId =
    vEventUserName.EventUserNameRowId LEFT OUTER JOIN
    vEventCategory ON vEvent.EventCategoryRowId =
    vEventCategory.EventCategoryRowId LEFT OUTER JOIN
    vEventPublisher ON vEvent.EventPublisherRowId =
    vEventPublisher.EventPublisherRowId LEFT OUTER JOIN
    vEventLoggingComputer ON vEvent.LoggingComputerRowId =
    vEventLoggingComputer.EventLoggingComputerRowId LEFT OUTER JOIN
    vEventLevel ON vEvent.EventLevelId = vEventLevel.EventLevelId LEFT OUTER JOIN
    vEventChannel ON vEvent.EventChannelRowId =
    vEventChannel.EventChannelRowId LEFT OUTER JOIN
    Event.vEventDetail ON vEvent.EventOriginId = vEventDetail.EventOriginId
    WHERE vEventLevel.EventLevelTitle = 'Information'
    AND vEvent.Datetime between @StartDate and @EndDate
    AND (vEvent.EventDisplayNumber = 528
    OR vEvent.EventDisplayNumber = 865)
)
INSERT INTO @temp(RowNum, DateTime, EventSource, Computer,Type,EventID, EventChannelTitle, UserName, EventDescription, LogOffTime, LogTime)
SELECT *
, (SELECT T2.DateTime
    FROM CTE AS T2
    WHERE (CTE.RowNum + 1)= T2.RowNum) AS LogOffTime
, DATEDIFF(s, CTE.DateTime, (SELECT T2.DateTime
    FROM CTE AS T2
    WHERE (CTE.RowNum + 1)= T2.RowNum) ) AS LogTime
FROM CTE
WHERE RowNum%2 = 1
SELECT
    Computer,
    UserName,
    COUNT(t1.LogTime) AS [NUMBEROFLOGINS],
    MAX(t1.LogTime) AS [MAXLOGTIME],
    MIN(t1.LogTime) AS [MINLOGTIME],
    AVG(t1.LogTime) AS [AVGLOGTIME]
    FROM @temp t1
    GROUP BY Computer,UserName
Copy and paste the above (or your own query) into the Query Builder window and click Next.


Select your Report Type and click on Next
Select how you want to group your report. (I just kept the default settings)
Select the Table Style of the Report and click on Next
Enter Report server and Deployment folder info and click on Next.
Give your Report a name and select Preview Report and click on Finish
Change the report parameters’ data type from Text to Date/Time.
You need to change this for both parameters!
Let’s check the Report. Go to Preview, select the Start Date and End Date and click on View Report.
Ok it’s not exactly the way we would like it to be, but the results are there!
Let’s Pimp this Report a little.
So this could be the end result. And you can pimp it even more if you want…
------------------------------------------------------------------------------------------------------------
------------------------------------------------------------------------------------------------------------

528: Successful Logon
Event 528 is logged whenever an account logs on to the local computer, except for in the event of network logons (see event 540). Event 528 is logged whether the account used for logon is a local SAM account or a domain account.
The possible logon types:
  • 2: Interactive (logon at keyboard and screen of system). Windows 2000 records Terminal Services logons as this type rather than Type 10.
  • 3: Network (i.e. connection to a shared folder on this computer from elsewhere on the network, or IIS logon). Never logged by 528 on W2k and forward; see event 540.
  • 4: Batch (i.e. scheduled task)
  • 5: Service (service startup)
  • 7: Unlock (i.e. unattended workstation with password protected screen saver)
  • 8: NetworkCleartext (logon with credentials sent in clear text; most often indicates a logon to IIS with "basic authentication")
  • 9: NewCredentials
  • 10: RemoteInteractive (Terminal Services, Remote Desktop or Remote Assistance)
  • 11: CachedInteractive (logon with cached domain credentials, such as when logging on to a laptop away from the network)
For an explanation of the Logon Process field, see event 515. For an explanation of the Authentication Package field, see event 514.
Logon GUID is not documented. It is unclear what purpose the Caller User Name, Caller Process ID, and Transited Services fields serve.

Source Network Address corresponds to the IP address of the Workstation Name. Source Port is the TCP port of the workstation and has dubious value.
Logon ID is useful for correlating to many other events that occur during this logon session.
  • User Name: %1
  • Domain: %2
  • Logon ID: %3 (useful for correlating to many other events that occur during this logon session)
  • Logon Type: %4
  • Logon Process: %5
  • Authentication Package: %6
  • Workstation Name: %7
The following field is not logged in Windows 2000:
  • Logon GUID
The following fields are not logged in Windows 2000 or XP:
  • Caller User Name:
  • Caller Domain:
  • Caller Logon ID:
  • Caller Process ID:
  • Transited Services:
  • Source Network Address:
  • Source Port:

Successful Logon:

User Name:administrator

Domain:ELM
Logon ID:(0x0,0x558DD)
Logon Type:2
Logon Process:User32
Authentication Package:Negotiate
Workstation Name:W2MS
Windows XP and Windows Server 2003 add:
Logon GUID:{d39697e4-34a9-b3e0-f30a-d2ba517eb4a2}
Windows Server 2003 adds these fields:
Caller User Name:-
Caller Domain:-
Caller Logon ID:-
Caller Process ID: -
Transited Services: -
Source Network Address:10.42.42.170
Source Port:3165
----------------------------------------------------------------
----------------------------------------------------------------
540: Successful Network Logon
Event 540 gets logged when a user elsewhere on the network connects to a resource (e.g. shared folder) provided by the Server service on this computer. The Logon Type will always be 3 or 8, both of which indicate a network logon.
Logon type 3 is what you normally see.  Logon Type 8 means network logon with clear text authentication.  The only scenario where we've observed logon type 8 is with logons to IIS web-sites via Basic Authentication.  Don't immediately sound the alarms if you see logon type 8 since most Basic Authentication is wrapped up inside an SSL session via https.
For all other logon types see event 528.

Event 540 gets logged whether the account used for logon is a local SAM account or a domain account.

For an explanation of logon processes see event 515. For an explanation of authentication package see event 514.

Logon GUID is not documented. It is not clear what the caller user, caller process ID, transited services are about.

Source Network Address corresponds to the IP address of the Workstation Name. Source Port is the TCP port of the workstation and has dubious value.
  • User Name: %1
  • Domain: %2
  • Logon ID: %3
  • Logon Type: %4
  • Logon Process: %5
  • Authentication Package: %6
  • Workstation Name: %7
The following field is not logged in Windows 2000:
  • Logon GUID
The following fields are not logged in Windows 2000 or XP:
  • Caller User Name:
  • Caller Domain:
  • Caller Logon ID:
  • Caller Process ID:
  • Transited Services:
  • Source Network Address:
  • Source Port:

Successful Network Logon

User Name: %1
Domain: %2
Logon ID: %3
Logon Type: %4
Logon Process: %5
Authentication Package: %6
Workstation Name: %7

Windows XP and Windows Server 2003 add:

Logon GUID:{d39697e4-34a9-b3e0-f30a-d2ba517eb4a2}
Windows Server 2003 adds these fields:
Caller User Name:-
Caller Domain:-
Caller Logon ID:-
Caller Process ID: -
Transited Services: -
Source Network Address:10.42.42.170
Source Port:3165

----------------------------------------------------------------
----------------------------------------------------------------
4624: An account was successfully logged on
This is a highly valuable event since it documents each and every successful attempt to log on to the local computer, regardless of the logon type, the location of the user or the type of account.

Subject:

Identifies the account that requested the logon - NOT the user who just logged on. Subject is usually Null or one of the Service principals and not usually useful information. See New Logon for who just logged on to the system.

Logon Type:

This is a valuable piece of information as it tells you HOW the user just logged on:
  • 2: Interactive (logon at keyboard and screen of system)
  • 3: Network (i.e. connection to a shared folder on this computer from elsewhere on the network)
  • 4: Batch (i.e. scheduled task)
  • 5: Service (service startup)
  • 7: Unlock (i.e. unattended workstation with password protected screen saver)
  • 8: NetworkCleartext (logon with credentials sent in clear text; most often indicates a logon to IIS with "basic authentication")
  • 9: NewCredentials, such as with RunAs or mapping a network drive with alternate credentials. This logon type does not seem to show up in any events. If you want to track users attempting to logon with alternate credentials see 4648.
  • 10: RemoteInteractive (Terminal Services, Remote Desktop or Remote Assistance)
  • 11: CachedInteractive (logon with cached domain credentials such as when logging on to a laptop when away from the network)

New Logon:

The user who just logged on is identified by the Account Name and Account Domain. You can determine whether the account is local or domain by comparing the Account Domain to the computer name. If they match, the account is a local account on that system; otherwise it is a domain account.
  • Security ID: the SID of the account
  • Account Name: logon name of the account
  • Account Domain: domain name of the account (pre-Win2k domain name)
  • Logon ID: a semi-unique (unique between reboots) number that identifies the logon session just initiated. Any events logged subsequently during this logon session will report the same Logon ID through to the logoff event 4647 or 4634.
  • Logon GUID: supposedly you should be able to correlate logon events on this computer with corresponding authentication events on the domain controller using this GUID, such as linking 4624 on the member computer to 4769 on the DC. But the GUIDs do not match between logon events on member computers and the authentication events on the domain controller.

Process Information:

The Process Name identifies the program executable that processed the logon.  This is one of the trusted logon processes identified by 4611. Process ID is the process ID specified when the executable started as logged in 4688.

Network Information:

This section identifies WHERE the user was when he logged on. Of course, if the logon is initiated from the same computer this information will either be blank or reflect the local computer itself.
  • Workstation Name: the computer name of the computer where the user is physically present in most cases, unless this logon was initiated by a server application acting on behalf of the user. Workstation Name may also not be filled in for some Kerberos logons, since the Kerberos protocol doesn't really care about the computer account in the case of user logons and therefore lacks any field for carrying the workstation name in the ticket request message.
  • Source Network Address: the IP address of the computer where the user is physically present in most cases, unless this logon was initiated by a server application acting on behalf of the user. If this logon is initiated locally the IP address will sometimes be 127.0.0.1 instead of the local computer's actual IP address. This field is also blank sometimes because Microsoft says "Not every code path in Windows Server 2003 is instrumented for IP address, so it's not always filled out."
  • Source Port: identifies the source TCP port of the logon request, which seems useless since with most protocols source ports are random.

Detailed Authentication Information:

  • Logon Process (see 4611)
  • CredPro indicates a logon initiated by User Account Control
  • Authentication Package (see 4610 or 4622)
  • Transited Services - This has to do with server applications that need to accept some other type of authentication from the client and then transition to Kerberos for accessing other resources on behalf of the client. See http://msdn.microsoft.com/msdnmag/issues/03/04/SecurityBriefs/
  • Package name - If this logon was authenticated via the NTLM protocol (instead of Kerberos for instance) this field tells you which version of NTLM was used.  See security option "Network security: LAN Manager authentication level"
  • Key Length - Length of key protecting the "secure channel".  See security option "Domain Member: Require strong (Windows 2000 or later) session key".  If value is 0 this would indicate security option "Domain Member: Digitally encrypt secure channel data (when possible)" failed.

An account was successfully logged on.
Subject:
   
Security ID:  SYSTEM
   Account Name:  WIN-R9H529RIO4Y$
   Account Domain:  WORKGROUP
   Logon ID:  0x3e7

Logon Type:10
New Logon:
      Security ID:  WIN-R9H529RIO4Y\Administrator
   Account Name:  Administrator
   Account Domain:  WIN-R9H529RIO4Y
   Logon ID:  0x19f4c
   Logon GUID:  {00000000-0000-0000-0000-000000000000}

Process Information:
   Process ID:  0x4c0
   Process Name:  C:\Windows\System32\winlogon.exe

Network Information:
   Workstation Name: WIN-R9H529RIO4Y
   Source Network Address: 10.42.42.211
   Source Port:  1181

Detailed Authentication Information:
   Logon Process:  User32
   Authentication Package: Negotiate
   Transited Services: -
   Package Name (NTLM only): -
   Key Length:  0


This event is generated when a logon session is created. It is generated on the computer that was accessed.
The subject fields indicate the account on the local system which requested the logon. This is most commonly a service such as the Server service, or a local process such as Winlogon.exe or Services.exe.
The logon type field indicates the kind of logon that occurred. The most common types are 2 (interactive) and 3 (network).
The New Logon fields indicate the account for whom the new logon was created, i.e. the account that was logged on.
The network fields indicate where a remote logon request originated. Workstation name is not always available and may be left blank in some cases.
The authentication information fields provide detailed information about this specific logon request.
  • Logon GUID is a unique identifier that can be used to correlate this event with a KDC event.
  • Transited services indicate which intermediate services have participated in this logon request.
  • Package name indicates which sub-protocol was used among the NTLM protocols.
  • Key length indicates the length of the generated session key. This will be 0 if no session key was requested.



OpsMgr Custom Reporting Links



Performance Reports – The easy way:
http://blogs.technet.com/b/jonathanalmquist/archive/2009/02/21/performance-reports-the-easy-way.aspx

Custom performance report with threshold line:

Performance Reports and Groups:
http://contoso.se/blog/?p=629

Blog: How to create a Linked Performance Report in the Authoring Console - Part 1: The XML Parameter Block: 
Blog: Linked Reporting Authoring Part 2: Authoring a Linked Report in the OpsMgr Authoring Console: 
Forecasting Reporting Scenarios – More Samples: 
Blog: Creating Useful Custom Reports in OpsMgr: How to schedule my custom report for delivery: 
Creating Useful Custom Reports in OpsMgr: How to make my custom report publicly available within my organization: 
Blog: Creating Useful Custom Reports in OpsMgr: How to create a custom performance counter report for a group of servers: 
Blog: Creating Useful Custom Reports in OpsMgr: How to create a processor utilization report for a group of servers: 
Blog: Creating Useful Custom Reports in OpsMgr: How to create a simple free disk space report: 
Blog: Creating Useful Custom Reports in OpsMgr: Gathering Custom Performance Counters: 
Blog: QuickTricks: Creating really easy multiple server performance reports & how to create a report for multiple objects when you don't know what object(s) to choose: 
Pimp up My Reports: How to beef up the Operating System Configuration Report: 
Blog: How do I know what objects to choose when building a custom report?: 
SCOM: Deliver Reports to Sharepoint:
SCOM reports on performance counters for large groups of servers: 
Custom Report Authoring for Beginners: 
Tip on Custom OpsMgr Report Parameters: 
Adding Report Details to your Custom OpsMgr Report: 
OpsMgr Custom Reporting Tips & Tricks: 
The OperationsManagerDW Schema: 
OpsMgr Authoring Reporting Guide: 

Creating a Custom Report for SCOM 2007 R2 with SQL 2008 reporting in Microsoft Visual Studio 2008

Most of the time we can rely on the more general reporting features delivered with SCOM 2007 R2. But when you want to design a really nice looking report that is easier to generate and target, you need to dive into Report Designer, or the even more flexible Visual Studio.



Although the approach may be the same for most steps, reports created in Visual Studio 2008 are NOT backwards compatible with SQL Reporting Services 2005. Reports designed with Visual Studio 2008 can only be used if you are running SQL 2008 Reporting Services! So for SQL Reporting Services 2005 you should use Visual Studio 2005 instead!
Wouldn’t it be nice to design your own reports with nice looking bars or graphs and your company logo on them?


Requirements before starting are:
  • Visual Studio 2008 (Business Intelligence Development Studio), which is delivered with SQL Server 2008
  • The OpsMgr Authoring Console
  • Feeling really artistic!

Before we open Visual Studio you first need to determine which counters you want to generate the report on.
This example will create a custom report on logical disk space, since this is one of the most asked for reports and it is missing from the default reports.

Counter(s) of interest

We are going to start with % Free Space.
This is the information we need to know to be able to collect the counters:
Counter : % Free Space
Object: LogicalDisk
The other counters are also of interest, but to start we need to know the object and counter to be able to set up a dataset that retrieves the information from the data warehouse.
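If you are not sure which objects and counters are available in your data warehouse, one way to list them (a small helper query, not from the original post) is to query the vPerformanceRule view directly:
-- Run this against the OperationsManagerDW database
SELECT DISTINCT ObjectName, CounterName
FROM vPerformanceRule
ORDER BY ObjectName, CounterName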

The beginning

Open Visual Studio and follow the steps below to start a new project in which we are going to create the disk space report.

Go to new and start a new project.
Select a Report Server Project and give it a Name.


After Creating the project we are going to create a report within this project.
Navigate to Solution Explorer on the right side of the screen
(if it is not shown, go to View and select Solution Explorer). Right-click on Reports and select Add New Item…


Select Report and give it a Friendly Name and click Add.


Now we have a report in our project.
Next we are going to add an Item to the report.
As stated in the design surface you can add one by using the toolbox on the right or you can right click and select insert.


When you right click and select insert you have the following options you can select.
For our example we are going to select a Matrix.


When the Matrix is selected we need to define a dataset. This is going to be the query we are going to use to query the OperationsManagerDW.
Give it a friendly name and select New next to Datasource.


The data source is the shared data source the Reporting Server uses to connect to the OperationsManagerDW. Its name is “Data Warehouse Main”. Unfortunately Visual Studio 2008 doesn’t support data source names with spaces, so we fill in “DataWarehouseMain” instead; we will change this back later in the Authoring Console.
Next is the connection string: simply click Edit and browse your SQL server for the OperationsManagerDW.
The connection string to your data warehouse is used to test our reports directly from Visual Studio.
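For reference, the connection string simply points at the SQL Server instance hosting the data warehouse; the server name below is a placeholder, so replace it with your own:
Data Source=<YourSQLServer>;Initial Catalog=OperationsManagerDW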


After Setting up the datasource we can add the query we are going to use.
For ease of use, click Query Designer…
The Query Designer will open; add your query in the box that contains SELECT FROM.


Main Dataset Query

This is the query we are going to use. The SELECT and FROM parts collect the data from the Perf.vPerfDaily view, which is the view for daily data in the OperationsManagerDW.
TIP! If you change vPerfDaily to vPerfHourly you will retrieve the data per hour!
The WHERE part is where we filter the data. Notice that we filter on date with @Start_Date and @End_Date, which will be the parameters we are going to define; this way we can tell from which date to which date we want to run the report. The data is also filtered on the LogicalDisk object, so only LogicalDisk data is retrieved. Last, we also filter on @ServerName, which will also be a parameter we declare, for the server name(s) we want to retrieve the data for.

Lastly, the data is ordered by date (ORDER BY), so it is retrieved in chronological order.
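As a reference, a query along the following lines would match the description above; it reuses the same OperationsManagerDW views and joins as the DataSet_Servers query shown further down, and the exact column list is an assumption you should adjust to the fields you want in your report. Note the AverageValue column, which is what the report expression Fields!AverageValue.Value uses later on.
SELECT
    vManagedEntity.Path,
    vPerformanceRuleInstance.InstanceName,
    vPerformanceRule.ObjectName,
    vPerformanceRule.CounterName,
    vPerf.DateTime,
    vPerf.SampleCount,
    vPerf.AverageValue,
    vPerf.MinValue,
    vPerf.MaxValue
FROM Perf.vPerfDaily AS vPerf INNER JOIN
    vPerformanceRuleInstance ON vPerformanceRuleInstance.PerformanceRuleInstanceRowId = vPerf.PerformanceRuleInstanceRowId INNER JOIN
    vManagedEntity ON vPerf.ManagedEntityRowId = vManagedEntity.ManagedEntityRowId INNER JOIN
    vPerformanceRule ON vPerformanceRuleInstance.RuleRowId = vPerformanceRule.RuleRowId
WHERE vPerformanceRule.ObjectName = 'LogicalDisk'
    AND vPerf.DateTime BETWEEN @Start_Date AND @End_Date
    -- @ServerName is set up later as a multi-value parameter, hence IN instead of =
    AND vManagedEntity.Path IN (@ServerName)
ORDER BY vPerf.DateTime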




Well after pasting in the query you can run it by pressing “!”.
It will ask for the parameters; fill in a start date, an end date and a server name (FQDN).
The query designer now shows the data and how it collected the data by joining the tables.

How cool is that? You are already starting to look like a developer with this cool code and these models ;-)
After looking over your result you will notice we are retrieving more data than we are actually going to use in this example. No worries, you will probably have enough inspiration after designing the report to use the other data as well!

When you are done looking at your developer-like screen, click OK to save your dataset settings.




We are going to create one more dataset and after that declare the parameters.
First the dataset; let’s call it DataSet_Servers.
Use the following query:
Select DISTINCT vManagedEntity.Path
FROM Perf.vPerfDaily AS vPerf INNER JOIN
vPerformanceRuleInstance ON vPerformanceRuleInstance.PerformanceRuleInstanceRowId = vPerf.PerformanceRuleInstanceRowId INNER JOIN
vManagedEntity ON vPerf.ManagedEntityRowId = vManagedEntity.ManagedEntityRowId INNER JOIN
vPerformanceRule ON vPerformanceRuleInstance.RuleRowId = vPerformanceRule.RuleRowId
Where ObjectName = 'logicaldisk'
This query retrieves all Server Objects to report on.



Declaring the parameters is the last thing to do before we can start designing the report.
These are our parameters:
@Start_Date
@End_Date
@ServerName

Go to Parameters and open their properties.


In the parameter properties for @End_Date, set the data type to Date/Time on the General tab. Next go to the Default Values tab and select Specify values; this sets the default values for the parameters when the report is opened. For Value, click the Function button (fx) and you are in the expression designer. Go to Common Functions\Date & Time and select Today (double-click).

TIP! There are also examples of how to use each function; you can select one to find out what it is used for!
Function for @End_Date, which should be today:
=Today()
Do the same for @Start_Date with the following function, which sets the start date to seven days before today:
=DateAdd("d",-7,Today())
Finally, for @ServerName, select Multiple values on the General tab. Next open the Available Values tab and select Get values from a query.
Choose the dataset we created, DataSet_Servers, and fill in both the Value field and the Label field with Path. Simply use the drop-downs to select the dataset and the values.


Now we can start with the design: drag and drop the values from the Report Data pane onto the matrix, as shown on the right.
You can now run a test report to check which results you get.


When you test the report you will notice there are many counters and a lot of data.
First we are going to filter the results, since for this example we are only interested in % Free Space.
To filter the results we need to change the column group CounterName.
Open the group by selecting the column and going to the group properties.
Now navigate to Filters. Here we are going to filter our results.
Click Add to add a filter; in Expression select [CounterName], use the = sign as the operator and fill in % Free Space. Select OK and run the report again to check the results. Running the report shows you how the data is presented and gives you a quick way to visualize what you are actually doing.

After adding the filter we have narrowed the results down to only % Free Space. Except now the value shown is calculated over all results together.


We need to change this to a single value that makes sense, preferably the most up-to-date one.
Select the cell and right click to open the FX Expression designer again.
 Almost every value in the report has a possibility to add expressions! This makes it very flexible!
Now change the expression to the following:
=Round(Last(Fields!AverageValue.Value))
The Round function is used to round the number so it’s easier to read, instead of a pi-like number ;-)
The Last function is used to retrieve only the last value, which is today’s value: because we ordered the query on date, we know this is the last number!


The result should now look something like the picture on the right. By adding the filter on CounterName we only see the counter % Free Space,
and by adding the expression
=Round(Last(Fields!AverageValue.Value)) in the details cell we have the results shown.
The screen is just from a test environment and the results are probably different but the basics should look the same.
Computer \ Instances running on the computer and free space.
The next steps will draw the graph in a nicer way.


The Designing Part

The designing part really depends on your personal taste, but I will show you how to make your report look sharper in a couple of minutes.

Let’s add some more to the report.
Open the toolbox and drag and drop a Gauge onto the drawing area.

Wow aren’t those meters looking sharp! 

Select the one you want to use and click OK.
The meter will be pasted into your design as a new object.
Simply drag and drop your new meter object into your matrix and release it in the cell with =Round(Last(Fields!AverageValue.Value)).



First let’s go to report properties to check and maybe change the page settings of the report.
Pay close attention to the values given here: when designing you should always make sure you stay within these boundaries, especially when you want the report to be exported to, for example, PDF.
If you go over these limits the report will expand over multiple pages, which isn’t always nice looking.

If you look at the default values you will notice the width is 8.5 with left and right margins of 1 inch, which makes your drawing area 6.5 wide; the same applies to the height of the page, which leaves 9 for the drawing area.

Make sure your design area stays within 6.5 by 9!
Simply click on your design area and go to its properties (if no properties are shown on the left of the screen, press F4 to make them appear). Now change your design area to 6.5 by 9; this way you are always sure you are editing within the page limits!

TIP! When you get blank pages it is most of the time because these boundaries are not set correctly!


The matrix should now look something like this. You can expand the matrix from one side of your design area to the other to make the gauge more visible.
You can select the different objects which make up the gauge. The important one is the bar, as shown. For both pointers add the expression
=Round(Last(Fields!AverageValue.Value)). This will show the collected % Free Space value, but now on the bar!

TIP! When designing the bar, first make it look the way you want and only then edit the size. This makes navigation easier ;-)


After playing around endlessly with colors and settings, let’s continue: we have a report to make ;-)
Let’s make a page header and footer to hold extra information like the title, company logo, execution time, etc.
Right-click on your design area to add a page header and footer. Here you can add a text box or an image.
In the text box you can also add expressions, or drag and drop report properties into the text box.
This way you can display the report’s logo, title, creation date, etc.
Save your report as an .rdl file and open the SCOM Authoring Console.


Create a new management pack and go to the Reporting tab. Create a new report and give it a name.
Next go to the Definition tab and select “Load content from file”.
Navigate to the .rdl file and import it.

Now you will see the XML version of your report in the Authoring Console.
First remove the first line.

After removing this line we need to correct the data source setting.
Remove the connection settings, i.e. the ConnectionProperties element that contains
Data Source=.;Initial Catalog=OperationsManagerDW
and the IntegratedSecurity value of true,
and in their place reference the shared data source by pointing the DataSource element to Data Warehouse Main (a DataSourceReference element).
TIP! Don’t forget to change the Visible setting on the Options tab to True, otherwise no report will be shown!!!
Now you can save your report in the management pack, and after this you can import the management pack into your environment.

Result

This post described how to create a report for SQL 2008 Reporting Services with Visual Studio 2008. Although it takes some time to get the graphical results you want, you can now create a custom report.
This report is a simple example of how to create reports using Visual Studio 2008. I know there are more, and possibly better, solutions and queries to get results, but that is for another post ;-)

Source: Oskar Landman