Categories
BI Publisher Peoplesoft

Displaying related content on Peoplesoft BI Publisher Crosstab Report

If you need to print related content against each row of a crosstab report, you can do this by assigning an “incontext” variable to the value of the controlling field on the row (typically assigned to “./H”), and then using that as a filter in an XPath query back to the query results e.g.

Assign a variable with the value we are searching for:

<?variable@incontext:case_id;'./H'?>

And then use the variable in an XPath filter to find (say) a sibling value from the result set. In this case RC_SUMMARY for the given CASE_ID on the row:

<?(/query/row/CASE_ID[text()=$case_id]/../RC_SUMMARY)[1]?>

This query accesses the first RC_SUMMARY for the given CASE_ID in the query results.

Note the importance of the parentheses ( ) around the query here: they ensure we select just one node. Without them we could return repeated RC_SUMMARY values for a given CASE_ID.

NOTE: You could use the XPath sibling axes (following-sibling::, preceding-sibling::) to find the related value(s), but ‘../’ seems somehow more readable to me.
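To see why the first-match semantics matter, here is a minimal Python sketch of the same lookup over a hypothetical /query/row result set (the element names mirror the example above; the data values are illustrative only):

```python
import xml.etree.ElementTree as ET

# Hypothetical query result set, shaped like the /query/row XML
# that backs the crosstab (values are made up for illustration).
XML = """
<query>
  <row><CASE_ID>1001</CASE_ID><RC_SUMMARY>First note</RC_SUMMARY></row>
  <row><CASE_ID>1001</CASE_ID><RC_SUMMARY>Second note</RC_SUMMARY></row>
  <row><CASE_ID>1002</CASE_ID><RC_SUMMARY>Other case</RC_SUMMARY></row>
</query>
"""

def first_rc_summary(tree, case_id):
    """Mimic (/query/row/CASE_ID[text()=$case_id]/../RC_SUMMARY)[1]:
    collect RC_SUMMARY from every row matching CASE_ID, keep only the first."""
    matches = [row.findtext("RC_SUMMARY")
               for row in tree.findall("row")
               if row.findtext("CASE_ID") == case_id]
    return matches[0] if matches else None

tree = ET.fromstring(XML)
# CASE_ID 1001 appears twice; without the enclosing ( )[1] the XPath
# would return both RC_SUMMARY nodes, not just the first.
print(first_rc_summary(tree, "1001"))
```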

Categories
Peoplesoft Performance Tuning

SET ARITHABORT ON and PeopleSoft

My previous post focused on the specific issue of poor PS Query performance when ARITHABORT was set to OFF on the connection.

But the fix I suggested there also affects all other PeopleSoft connections: app servers, process schedulers, Application Engines, BI Publisher reports, pretty much every PS connection. And that really is what you want.

So remember:

EXEC sys.sp_configure N'user options', N'64'
GO
RECONFIGURE WITH OVERRIDE
GO

is your friend 🙂

But remember to test it first …. 🙂

Categories
Peoplesoft SQL Server Tuning

Poor Query Performance under SQL Server (PeopleSoft)

Background

A PeopleSoft HRMS 8.8 system running on SQL Server 2008 R2 SP2, used as a reporting environment. Data is copied from production each night, and a small, select group of users run queries against it.

The system performance was both terrible and unpredictable. There was a “rule” in place that queries running for over two hours were cancelled. To me, even two minutes seemed a long time for the data volumes involved.

Analysis

There was plenty wrong with the instance setup: default values for MAXDOP and “cost threshold for parallelism”, only one TempDB file, and unlimited RAM allocated to SQL Server. Essentially an out-of-the-box SQL Server install; not even a Cumulative Update had been applied. Yet SQL Server had 128 GB of RAM and 16 processors allocated to it, so it was hardly short of resources.

But none of that, in itself, really explained the very poor performance of many of the user queries in PeopleSoft, especially given that running the exact same query in SSMS gave completely different (and quite fast) results. The lack of indexing on the underlying tables and unnecessary Cartesian products in the PeopleSoft query security views made the queries less than quick, but they still typically completed in seconds or minutes in SSMS. In PS Query, whether online or scheduled, they often never finished at all, even after 5 or 6 hours.

Now this is a very old HRMS 8.8 database that has been upgraded from SQL Server 2000 and various old PeopleTools releases up to the giddy heights of 8.49.28 🙂

After much investigation, running SQL Server Profiler alongside Adam Machanic’s superb sp_WhoIsActive (read more about it from Brent Ozar here), it all turned out to be related to the setting of ARITHABORT: a well-known but, I think, poorly understood issue.

There are numerous articles on the internet about the issues of SQL running faster in SSMS than via other clients. I leave you to search for those yourself, but to summarise:

You must make sure the connection settings you use in SSMS match those used by the application. Otherwise the query plan you see in SSMS may not bear any similarity to the one the application is using. This makes problem solving nearly impossible.

Profiler will help you identify the options set at the connection level:

[Screenshot: Profiler trace showing the connection-level ARITHABORT setting]

And using:

EXEC sp_WhoIsActive @get_plans=1

will give you the session and the query plan which you can open and compare to the same plan in SSMS. If they are different you need to investigate why.

[Screenshot: sp_WhoIsActive output including the query plan column]

ARITHABORT

ARITHABORT affects the query plan and forms part of the plan cache key, so you will not get the same query plan in SSMS as the client application if the values differ: SQL Server will hold two plans for the same SQL statement. Remember, too, that SSMS issues SET commands as defined in Tools -> Options -> SQL Server -> Query Execution:

 

[Screenshot: SSMS Query Execution options, with SET ARITHABORT ticked by default]
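To make the “cache key” point concrete, here is a toy Python sketch (emphatically not SQL Server internals): because plan-affecting SET options form part of the cache key, the same statement compiled under ARITHABORT ON and OFF occupies two separate cache entries.

```python
# Toy model of a plan cache keyed by (statement text, SET options).
# SQL Server's real cache key includes a set_options bitmask; this
# sketch only illustrates the consequence, not the implementation.
plan_cache = {}

def get_plan(sql_text, arithabort):
    key = (sql_text, arithabort)  # SET options are part of the cache key
    if key not in plan_cache:
        plan_cache[key] = (
            f"plan compiled with ARITHABORT {'ON' if arithabort else 'OFF'}"
        )
    return plan_cache[key]

sql = "SELECT * FROM PS_JOB WHERE EMPLID = @P1"  # illustrative statement
get_plan(sql, True)   # SSMS default: ARITHABORT ON
get_plan(sql, False)  # ODBC client default: ARITHABORT OFF
print(len(plan_cache))  # 2: two cached plans for one statement
```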

ARITHABORT in PeopleSoft

Now the interesting thing about ARITHABORT is that the standard Peoplesoft install scripts create the Peoplesoft database and then issue two ALTER DATABASE commands:

ALTER DATABASE <DBNAME>
SET ARITHABORT ON
GO
ALTER DATABASE <DBNAME>
SET QUOTED_IDENTIFIER ON
GO

where <DBNAME> is your database name.

The install guide also tells you to ensure QUOTED_IDENTIFIER is ON for any client connections to a PeopleSoft database, and explains that this can be set in the ODBC connection setup. What it fails to mention is ARITHABORT for client connections. This should also be ON, but it cannot be set on the ODBC connection, so it is best set at the instance level:

EXEC sys.sp_configure N'user options', N'64'
GO
RECONFIGURE WITH OVERRIDE
GO

Assuming you have done that, and you have the default SSMS settings in place, the query execution plan you see for the PeopleSoft application via ODBC should match the one you get in SSMS. More importantly, in a great many cases the plan chosen will be much better than the one compiled with ARITHABORT OFF.

In this specific case, it seems that someone created the original database without using the supplied script, so neither of these defaults was set at the database level. For QUOTED_IDENTIFIER they at least stuck with the ODBC connection defaults, so that was always ON. Sadly, ARITHABORT was not.

The Fix

A quick ALTER DATABASE, a reconfigure of SQL Server, a bounce of the PeopleSoft application servers and a SQL Server instance restart, and the problem was solved. It didn’t make every query faster, but it did improve the vast majority; the ones it did not fix are typically the ones with missing joins 🙂 Now all I need to do is create some indexes that reflect the way the data is accessed, and rewrite the query security views to be more efficient, and I should be able to get all the queries under two minutes. Well, those with joins between the tables, anyway.

NOTE: You could, and perhaps **should**, include QUOTED_IDENTIFIER in that user options setting for extra safety (use 320 instead of 64 in the sp_configure call above), just in case someone un-ticks that option while setting up ODBC.
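The ‘user options’ value is just a bitmask of default SET options, so the 64 and 320 figures fall out of simple addition (bit values per the SQL Server documentation for sp_configure ‘user options’):

```python
# Bit values from the documented 'user options' bitmask.
ARITHABORT = 64
QUOTED_IDENTIFIER = 256

# ARITHABORT alone, as in the post above:
print(ARITHABORT)                       # 64
# Both defaults on, as suggested in the NOTE:
print(ARITHABORT | QUOTED_IDENTIFIER)   # 320
```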

Categories
Crystal Reports Off-shoring Peoplesoft

Crystal Reports is slow! Or is it ….?

We had a Crystal Report for Billing that took over an hour to create 230 bills in PeopleSoft 9.1.

Before I continue, I should add that this was a bespoke report developed by off-shore resources. I should also add that I had been doing development QA for quite a while on this project when this issue came up. I was also responsible for application performance analysis and tuning.

I undertook a performance analysis of this report, fully expecting some badly written SQL to be the root cause as that was a common issue with our off-shore resources. However, what I found was even more incredible in my view. To explain:

Categories
APMP PRINCE Project Management

The Key Skills of a Project Manager

You can have all the Prince/APMP qualifications you like but without these….

  1. People Person
  2. Good Listener
  3. Organised
  4. Gravitas/Presence

You might as well not bother.

Which is exactly why I have worked with only three really good PMs in over 25 years in IT.

Categories
EBS ETL Off-shoring Pentaho Peoplesoft

Customers can only have one address …..

During my time converting PeopleSoft data to Oracle EBS, I remember being asked to create a spreadsheet output using Pentaho for the data load of customers, with a number of tabs including:

  • Customer Info
  • Customer Addresses

This request came from the “off-shore” resource we had on-shore from India at the time. An EBS “expert” … or so we were told.

I was informed in quite a lot of detail which columns they wanted and some simple transformations/edits they needed performed on the data. The interesting thing about the Customer Addresses data they asked for was that there was no indication of the sequence the addresses should be in, nor how we should indicate (say) the primary address, the delivery address, the billing address etc.

I questioned this and was firmly told “just do it the way they ask for it – they know what they are doing”. I had my doubts.

But I did it.

The next day the EBS “guru” rejected my data file because it had duplicate addresses in it. When I questioned what that meant exactly, the guru said “there is more than one address for a customer”. I pointed out that the data was correct and that there were lots of customers with multiple addresses – in fact most of them had at least two. To which he claimed “in EBS a customer can only have one address”.

I think my face said it all really.

I suggested something along the lines of “RTFM”.

Note: It seems that the conversion approach taken by this off-shoring company was to load the data into staging tables they had created based on the EBS standard data load tables, but with only the fields they “thought” they needed. They then wrote scripts to populate the standard load tables from their customised tables. Clueless. A car crash waiting to happen … and it surely did.

Categories
ETL

Pentaho Data Integration

The project I am currently on is using Pentaho Data Integration (aka PDI or Kettle) to migrate data from Peoplesoft ERP (8.4 SP2) to Oracle eBusiness Suite (11.5). Notwithstanding the obvious absurdity of this move (I am a Peoplesoft person … so I am biased), the Pentaho ETL tool is proving to be truly excellent for the task at hand. Whilst the approach to this data migration is best described as “atypical”, the tool is at least making things easier than they might otherwise be.

I have used PDI on previous projects to migrate Peoplesoft data to Agresso Business World and to migrate Peoplesoft data to Peoplesoft itself – modified of course. Check out the Pentaho website for more details.

 

 

Categories
Workday

Workday Rising

I was fortunate enough to attend Workday Rising 2011 in Las Vegas last month. I managed to combine the trip with a long anticipated holiday in Vegas thanks to a very understanding partner. It was a truly excellent event that reminded me very much of Peoplesoft events in “the old days”. Same old faces in a lot of cases (and not just Dave Duffield!) but the buzz was just like the early 90’s. And as for the Workday software – it’s maturing rapidly into a real giant killer. Watch out Oracle – you are up against it! Check out more about the Workday technology stack here:

 
Workday Technology Stack at dbms2.com