
Reporting off of the Logfile

Part of the Performance and Scaling series.

In a previous article, we showed how to examine the Exago log file to determine the time taken for network calls to the database. The same method can be used to determine overall report execution time, as well as the time taken by each step of data processing. This can be helpful for finding inefficiencies or bottlenecks in the execution process.

This time, instead of opening the log file in a text editor, we will explore how to write a custom data source assembly that pulls the log file into Exago BI itself, so that you can report on performance data in tabular or visual form.

[Figure: Reporting off of the Exago BI Logfile]

An assembly example that works out of the box, and that can be customized for your own needs, is provided at the end of this article.

Background

The ReportExecuteEnd events log the total report execution time in the detail information:

2017-10-11 17:32:17,101 INFO  [4xzemyj3rzt4kq4rnevram45] [Api.Execute.ExecuteReport.Process] ReportExecuteEnd: Name: Examples\C1, UserId: , CompanyId: , ExecuteTime: 00m:04s

This isn't itemized, however, and is therefore of marginal use on its own. The real meat of the data is in the timestamp that precedes each line, which records event time down to the millisecond. Parsing this into the application will allow us to perform much more detailed analysis.
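
As a minimal sketch (assuming the timestamp format shown in the excerpt above), the timestamp can be pulled off the front of each line like this:

    using System;
    using System.Globalization;

    // The first 23 characters of each event line are the timestamp,
    // e.g. "2017-10-11 17:32:17,101". Note the comma before the
    // milliseconds, which we handle with an explicit format string.
    static DateTime ParseTimestamp(string line)
    {
        return DateTime.ParseExact(
            line.Substring(0, 23),
            "yyyy-MM-dd HH:mm:ss,fff",
            CultureInfo.InvariantCulture);
    }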

In order to dig into execution data, the application logging should be set to record DEBUG-level information. This can be set in the log4net.config or, as of v2017.2, in the Admin Console/configuration file.
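
A minimal log4net.config sketch is shown below. The file path is hypothetical, and you should keep the <layout> element from the configuration that ships with your installation so the line format parsed below stays the same:

    <log4net>
      <appender name="ExagoRollingFile" type="log4net.Appender.RollingFileAppender">
        <!-- Hypothetical path: keep the log outside the Temp directory
             so it is not purged (see the note at the end of this article). -->
        <file value="C:\ExagoLogs\Exago.log" />
        <appendToFile value="true" />
        <!-- Reuse the <layout> element from your existing config here. -->
      </appender>
      <root>
        <!-- DEBUG is required to capture the per-step execution events. -->
        <level value="DEBUG" />
        <appender-ref ref="ExagoRollingFile" />
      </root>
    </log4net>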

We can categorize rows by process and then use a vertical table transformation to determine the time between events. This will require some logical parsing of the log file. Thankfully, the logfile format is relatively consistent, so this is not overly challenging.

Parsing the Log

Each new event in the log file is preceded by a timestamp on a new line. This makes it easy to go line by line and associate events into rows. Additionally, rows have a consistent format:

Timestamp LEVEL [alphanumeric] [EventType] Details

(The [alphanumeric] value is irrelevant for our purposes, and some rows do not have an [EventType].)
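
A rough sketch of a regular expression for this row format (the group names are our own):

    using System.Text.RegularExpressions;

    // Matches: Timestamp LEVEL [alphanumeric] [EventType] Details
    // The second bracketed group is optional, since some rows
    // have no [EventType].
    static readonly Regex EventLine = new Regex(
        @"^(?<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})\s+" +
        @"(?<level>[A-Z]+)\s+" +
        @"\[(?<context>[^\]]+)\]\s*" +
        @"(?:\[(?<eventType>[^\]]+)\]\s*)?" +
        @"(?<details>.*)$");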

The format of the Details is also consistent between events, with many events taking the form:

Process: Information

Some events have detail information that could be parsed further if desired.
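
For the common Process: Information shape, a split on the first colon is enough (a sketch):

    // Splits a details string such as
    // "ReportExecuteEnd: Name: Examples\C1, UserId: , ..." into
    // Process = "ReportExecuteEnd" and Information = everything after
    // the first colon, left intact for further parsing if desired.
    static (string Process, string Information) SplitDetails(string details)
    {
        int i = details.IndexOf(':');
        return i < 0
            ? (details.Trim(), string.Empty)
            : (details.Substring(0, i).Trim(), details.Substring(i + 1).Trim());
    }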

The only thing we are missing is an Id value to tie together the rows for a given execution. Thankfully, a ReportExecuteStart event marks the beginning of each set of execution events, so we can keep a running id, increment it each time we reach a ReportExecuteStart row, and assign the current value to each row. (Not every row is related to an execution, so we could do some additional processing to avoid assigning ExecutionIds to such rows.)
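
In sketch form (LogRow is a hypothetical container built from the regex groups above):

    using System;
    using System.Collections.Generic;

    // Hypothetical row shape built from one parsed log line.
    class LogRow
    {
        public DateTime Timestamp;
        public string Process;
        public string Information;
        public int ExecutionId;
    }

    // Assigns an incrementing ExecutionId, bumping the counter each
    // time a ReportExecuteStart row begins a new set of execution events.
    static void AssignExecutionIds(IEnumerable<LogRow> rows)
    {
        int executionId = 0;
        foreach (var row in rows)
        {
            if (row.Process == "ReportExecuteStart")
                executionId++;
            row.ExecutionId = executionId;
        }
    }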

The result is a vertical table that we can query using LINQ or SQL in order to do some math on execution times for different stages in the process.
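
For example, grouping the rows by ExecutionId and pairing consecutive events gives a per-step duration (a sketch, using the hypothetical LogRow above):

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // For each execution, orders the events by timestamp and reports
    // how long each step took: the delta between an event and the
    // event that follows it.
    static IEnumerable<(int ExecutionId, string Step, TimeSpan Duration)>
        StepDurations(IEnumerable<LogRow> rows)
    {
        return rows
            .Where(r => r.ExecutionId > 0)
            .GroupBy(r => r.ExecutionId)
            .SelectMany(g =>
            {
                var ordered = g.OrderBy(r => r.Timestamp).ToList();
                return ordered.Zip(ordered.Skip(1),
                    (cur, next) => (g.Key, cur.Process, next.Timestamp - cur.Timestamp));
            });
    }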

Example

This example demonstrates what was discussed and can be used as-is or as a starting point for reporting off of execution timings in Exago BI.

Download the example here.

Note: By default the web application logfile is saved in the Temp directory, which is purged occasionally. It is recommended to create a custom log4net.config that saves the logfile to a separate location (as in the configuration sketch above), especially when reporting off of long-term data.

