Performance question

Meeting place for R&R customers and clients to share tips and ideas. Post your questions, messages or problems here.
Mike_Brumley_(Guest)
Posts: 12
Joined: Tue Oct 10, 2017 12:44 pm

Performance question

Post by Mike_Brumley_(Guest) » Mon Feb 14, 2005 7:34 pm

I've been asked to find out if there's any way to speed up an xBase report. The report in question is an Auto Policy declaration: just two pages, not very complex, and no graphics, so they're wondering why it takes 16 seconds per report (on a fast single-user machine, preview only). It's doing a single scan through the master table's index, but with hiscope=loscope so it will only find one record. From that find, there are 46 lookups. These hit only 6 physical tables, but many fields (such as drivers, vehicles, and addresses) are multiple links to the same table, so tables can have from 1 to 11 aliases (there are 37 aliases in all). There are two totals and about 300 calculations.

What I've been doing is taking the report apart piece by piece and timing the results, trying to see just where all the time is going. A large chunk (30%) was going to the non-indexed scan required for the group used for the totals; no surprise there. Another 20% went to the calculations, again no surprise.

Finally the report was down to nothing but the header text, the scan, and the lookups, with a few simple calculations to build index keys for some of the lookups — but it still takes about 10 seconds per report, to do table lookups that would probably take under a second in C. I've seen R&R do about half that many lookups, plus some scans, and print the report, in a lot less time than that, so I'm thinking there's something I'm missing here.

Is this just a normal runtime for that many lookups? Is R&R performance affected more by aliased tables than non-aliased ones? Are there any settings that can improve performance in a case like this? Other potential issues I should be looking into? Right now all I can recommend to them is to lose the aliases by writing pre-report code to combine records into temp tables, so I'm definitely open to suggestions.

One last question, about the best way to run multiple instances of the same report. Currently they do this through multiple (up to several thousand) entries in the command-line control table. Does this approach have a cost in setup/cleanup time in R&R? If so, could it be large enough to be worth structuring an index into the master table in such a way that they can use a single control-table entry and loscope/hiscope instead?

Thanks,
Mike Brumley
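The percentages quoted above imply a rough time budget for the report. This is illustrative arithmetic only (the 30% and 20% figures come from the piece-by-piece elimination tests described in the post, not from any R&R instrumentation):

```python
# Rough time budget implied by the figures in the post above.
total = 16.0                  # seconds per report, preview only
group_scan = 0.30 * total     # non-indexed scan for the totals group
calcs = 0.20 * total          # ~300 calculations
scan_and_lookups = total - group_scan - calcs

print(f"group scan: {group_scan:.1f}s, calcs: {calcs:.1f}s, "
      f"scan + 46 lookups: {scan_and_lookups:.1f}s")
```

That leaves roughly 8 of the 16 seconds attributable to the master scan and the 46 lookups, broadly consistent with the ~10 seconds the stripped-down report still took.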

kfleming
Posts: 5795
Joined: Tue Oct 10, 2017 12:44 pm

=> RE: Performance question

Post by kfleming » Tue Feb 15, 2005 11:11 am

There is definitely some overhead for each relation that exists in a report. I do not think it matters whether the related table is unique or an aliased copy.

There is also overhead related to the printer that is used for the report.

For improving performance, scoping on the master file to process the fewest records, and purging any unused calculations, should produce the best efficiency. R&R internally tries to be smart and optimize where it can based on the design of the report.

For the runtime, it would not matter whether there are multiple control file records or a single re-used record.

Kathleen
R&R Support
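One way to reason about the overheads discussed in this thread is a toy cost model. Everything here is an assumption for illustration — the constants and the shape of the formula are guesses, not R&R internals:

```python
def report_time(runs, records_per_run, relations,
                setup=2.0, per_record=0.05, per_lookup=0.01):
    """Toy estimate in seconds (assumed constants, not measured):
    each run pays a fixed setup cost (opening tables, parsing
    calculations), then every master record pays one indexed
    lookup per relation."""
    per_run = setup + records_per_run * (per_record + relations * per_lookup)
    return runs * per_run

# Ten one-record runs vs. one ten-record run, 37 relations:
many_runs = report_time(runs=10, records_per_run=1, relations=37)
one_run = report_time(runs=1, records_per_run=10, relations=37)
print(f"{many_runs:.1f}s vs {one_run:.1f}s")
```

In a model of this shape, the fixed per-run setup term dominates whenever each control-file record drives only a handful of master records, which is one way per-record control entries could end up far slower than a single scoped run.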

rickwjohnson
Posts: 49
Joined: Tue Oct 10, 2017 12:44 pm

==> RE: Performance question

Post by rickwjohnson » Tue Feb 15, 2005 6:11 pm

I can vouch for the slow time searching for the printer. I had some very simple one-page, 100-record reports that took 10-12 seconds to find the printer until I put "D" (default) in my control file (RRWRUNIN). As soon as I did that, my reports would start almost immediately.
Rick Johnson

Nick
Posts: 103
Joined: Tue Oct 10, 2017 12:44 pm

===> RE: Performance question

Post by Nick » Wed Feb 16, 2005 8:26 pm

Searching for a printer is a known performance issue when a report is opened by the runtime (or the designer), and the more printers that exist on a LAN, the longer the wait.

The fix is to set the control file to use DEFAULT as the printer, which stops the searching.

The "saved" printer is often not the printer available to the deployed workstation running R&R, so I just tell my users that R&R will use the DEFAULT printer and then educate them on how to set the DEFAULT.

nick

Mike_Brumley_(Guest)
Posts: 12
Joined: Tue Oct 10, 2017 12:44 pm

=> RE: Performance question

Post by Mike_Brumley_(Guest) » Mon Feb 21, 2005 12:52 pm

Just wanted to thank everyone for the replies and pass on what I've discovered so far.

I had this report emptied to the point that all it did was connect to the tables, do a one-record scan on the master (since loscope=hiscope on a unique index), and then a group of indexed lookups into the slave tables. So, no real scans to speak of, and just a few string-concat calculations (with all unused calcs purged). It took over a minute to do that, which was what led me to ask my original question.

So I went back to the full report, running it with the same ten-record control file; it took 3:30 to complete. I then added a "PrintMe" field to the master table, with a value of "Y" for those ten records, and created an index identical to the original master index except that it started with that field. Finally, I changed loscope and hiscope to "Y" and the master index to the new index. So now we're reporting on the same ten records, but with a single control-file record instead of ten, and it's the scan of the master table that finds the ten matching records.

The time went from 3:30 to 0:20. I don't know what else to conclude except that there is in fact a substantial file-access overhead, which goes up as the number of tables and aliases increases. Aliases also seem a bit more expensive than tables: I started removing each and timing the result, and the more heavily aliased a table was, the greater the impact of removing that lookup and alias. My guess is that R&R is closing and re-opening tables for each line of the control file, in which case these results really aren't all that surprising. Well, some are — if you do the math, the 1:05 required for the "bare bones" run isn't nearly enough to span the 3:30-to-0:20 change. However, there are 300+ calculations in this report. That, and Kathleen's advice to purge unused calcs, makes me wonder: if the tables are closed and reopened for each control-file record, aren't the calculations also parsed and evaluated each time? There seems to be some overhead there as well, even larger than the file-access overhead if there are enough of them.

As for printers, we're storing the printer name ourselves and using it to set RI_WPTR in the control file, so I don't think that's an issue. It wouldn't be for my own tests, since I always had RI_PRINTER set to "D" (display only).

Thanks,
Mike Brumley
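The "PrintMe" technique described above can be sketched in miniature. This is a hypothetical illustration — Python's `bisect` stands in for an xBase index, and all names are made up — showing how a composite index keyed on the flag lets a single loscope=hiscope range scan find every flagged record in one pass:

```python
from bisect import bisect_left, bisect_right

# Hypothetical master table: a flag field plus the original key.
master = [
    {"policy_no": "P100", "print_me": "Y"},
    {"policy_no": "P101", "print_me": "N"},
    {"policy_no": "P102", "print_me": "Y"},
    {"policy_no": "P103", "print_me": "N"},
]

# Composite index: flag first, then the original key, as in the
# modified master index described in the post.
index = sorted((rec["print_me"], rec["policy_no"]) for rec in master)

def scoped_scan(index, loscope, hiscope):
    """Return the keys of all records whose flag falls in
    [loscope, hiscope] — one range scan instead of one probe
    per control-file record."""
    lo = bisect_left(index, (loscope, ""))
    hi = bisect_right(index, (hiscope, "\uffff"))
    return [policy for _flag, policy in index[lo:hi]]

print(scoped_scan(index, "Y", "Y"))
```

With loscope = hiscope = "Y", the scan lands on the flagged block of the index and walks it once, so the tables are opened and the calculations parsed a single time regardless of how many records match.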
