Categories: App Engine, PeopleCode, PeopleSoft

Where you call CreateRowset does matter …

In PeopleCode, it is common to see this construct:

Local Rowset &rs = CreateRowset(Record.RECNAME);

Often, this construct is used within a loop. But this is not a “free” statement – it goes to the database:

SELECT {column_list} FROM {tab_name} WHERE 1=2

On SQL Server, this gets wrapped in a SET FMTONLY ON / SET FMTONLY OFF statement pair in order to retrieve the column metadata (describe output).
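
In other words, each call sends the database something along these lines ({column_list} and {tab_name} stand in for the record's columns and underlying table, as above):

SET FMTONLY ON;
SELECT {column_list} FROM {tab_name} WHERE 1=2;
SET FMTONLY OFF;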

But if you use the above construct inside a loop, then there will be “n” executions of the above SQL – so “n” database round trips too. This is not free and will impact your performance.

Better to define the rowset at the component level and re-use it inside the loop.
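
As a sketch of the difference (RECNAME is just the placeholder record name from above, and &rowCount a hypothetical loop bound – adapt to your own code):

/* Costly pattern: every iteration issues the describe-style SELECT, so "n" round trips */
Local integer &i;
Local Rowset &rs;

For &i = 1 To &rowCount
   &rs = CreateRowset(Record.RECNAME);
   /* ... fill and process &rs ... */
End-For;

/* Cheaper pattern: create the rowset once, then Flush() and re-use it inside the loop */
Local Rowset &rs2 = CreateRowset(Record.RECNAME);

For &i = 1 To &rowCount
   &rs2.Flush();
   /* ... fill and process &rs2 ... */
End-For;

In component PeopleCode you would typically declare the variable at Component scope rather than Local, so the single CreateRowset call is shared across events.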

PS: The same applies to CreateRecord(Record.RECNAME). Think about where you place these statements in your code – especially in App Engine code, where you are likely to be looping through rows.

Categories: Crystal Reports, Off-shoring, PeopleSoft

Crystal Reports is slow! Or is it …?

We had a Crystal Report for Billing that took over an hour to create 230 bills in PeopleSoft 9.1.

Before I continue, I should add that this was a bespoke report developed by off-shore resources. I should also add that I had been doing development QA for quite a while on this project when this issue came up. I was also responsible for application performance analysis and tuning.

I undertook a performance analysis of this report, fully expecting some badly written SQL to be the root cause, as that was a common issue with our off-shore resources. However, what I found was, in my view, even more incredible. To explain:

Categories: EBS, ETL, Off-shoring, Pentaho, PeopleSoft

Customers can only have one address …

During my time converting PeopleSoft data to Oracle EBS, I remember being asked to create a spreadsheet output using Pentaho for the data load of customers, with a number of tabs including:

  • Customer Info
  • Customer Addresses

This request came from the “off-shore” resource we had on-shore from India at the time. An EBS “expert” … or so we were told.

I was informed in quite a lot of detail which columns they wanted and which simple transformations/edits they needed performed on the data. The interesting thing about the Customer Addresses data they asked for was that there was no indication of the sequence the addresses should be in, nor of how we should indicate (say) the primary address, the delivery address, the billing address and so on.

I questioned this and was firmly told “just do it the way they ask for it – they know what they are doing”. I had my doubts.

But I did it.

The next day the EBS “guru” rejected my data file because it had duplicate addresses in it. When I questioned what that meant exactly, the guru said “there is more than one address for a customer”. I pointed out that the data was correct and that there were lots of customers with multiple addresses – in fact most of them had at least two. To which he claimed “in EBS a customer can only have one address”.

I think my face said it all really.

I suggested something along the lines of “RTFM”.
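
For what it is worth, the claim is easy to test from the PeopleSoft side. A sketch of the kind of check that settles it (assuming the delivered PS_CUST_ADDRESS table is your address source – substitute whatever table your conversion actually extracts from):

-- customers with more than one address (ignoring effective-dated history rows)
SELECT SETID, CUST_ID, COUNT(DISTINCT ADDRESS_SEQ_NUM) AS ADDRESS_COUNT
  FROM PS_CUST_ADDRESS
 GROUP BY SETID, CUST_ID
HAVING COUNT(DISTINCT ADDRESS_SEQ_NUM) > 1
 ORDER BY ADDRESS_COUNT DESC;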

Note: It seems that the conversion approach taken by this off-shoring company was to load the data into staging tables they had created based on the EBS standard data load tables, but with only the fields they “thought” they needed. They then wrote scripts to populate the standard load tables from their customised tables. Clueless. A car crash waiting to happen … and it surely did.