Cloud and on-premise
What is available to you also depends on the service model: SaaS or on-premise.
If you are using Planon SaaS, the backend runs in a cloud infrastructure (like Amazon AWS or Microsoft Azure). The Planon Ops team ensures that your environment is provided with sufficient hardware capabilities, such as CPU and RAM, and adequate infrastructure settings.
For example, the Ops team has options to assign more hardware, implement multiple servers, and use a dedicated instead of a shared (database) server. SaaS customers do not have the ability to tune these infrastructure and hardware capabilities, nor do they need to do so.
This section discusses options that are available for both models.
In a Planon Cloud environment, performance can be improved by creating custom indexes on the database. You can create these database indexes in Field Definer, with the help of Planon support; this saves you the trouble of involving a database specialist to manually add them to the database. See the online WebHelp for more information. In an on-premise environment, the customer's DBA can create indexes.
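For on-premise environments, a DBA typically adds an index on columns that are frequently used in filters or joins. The sketch below is a minimal, generic illustration using Python's built-in sqlite3 module; the table and column names are hypothetical and do not reflect Planon's actual database schema.

```python
import sqlite3

# Minimal, generic illustration of adding an index to speed up a frequent filter.
# Table and column names are hypothetical; they do not reflect Planon's schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT, property_code TEXT)"
)

# Without an index, filtering on status + property_code scans the whole table.
# An index on the filtered columns lets the database look up matching rows directly.
conn.execute("CREATE INDEX idx_orders_status_property ON orders (status, property_code)")

# The query planner now reports an index search instead of a full table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE status = ? AND property_code = ?",
    ("Reported", "P100"),
).fetchall()
print(plan)
conn.close()
```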
Software configuration
This section describes many options that the administrator can use to influence performance.
Drill down selection
Although it is possible to select twenty thousand records and drill down to their details, performance will be poor. The administrator can help end users by creating appropriate filters and step filters.
Records per page
You can specify the maximum number of elements that are displayed in the elements list. The lower the number, the faster the list is displayed. The maximum number is five hundred [WebHelp link].
Scheduled processes
When a scheduled process is running, it will consume system resources, which could result in decreased response times for end users. If possible, it is advised to run scheduled jobs outside of business hours.
Generating reports
Some reports are generated by processing a lot of data. Such reports consume considerable amounts of application server memory. If possible, it is advised to execute such reports outside of business hours.
Polling frequency
Many background processes poll data from other sources at a certain interval. For example, cost centers can be imported from SAP. Another example is the polling frequency for measurement points (IoT sensors). Consider whether such frequencies are higher than required.
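A quick back-of-the-envelope calculation shows how much load a polling interval generates. The sketch below compares request volumes for a hypothetical fleet of IoT measurement points; all numbers are illustrative only.

```python
# Back-of-the-envelope load estimate for polling IoT measurement points.
# All numbers are hypothetical and only illustrate how the interval choice scales.
SENSORS = 2_000

def polls_per_day(interval_minutes: int) -> int:
    return SENSORS * (24 * 60 // interval_minutes)

for interval in (1, 5, 15, 60):
    print(f"every {interval:>2} min -> {polls_per_day(interval):,} requests/day")

# every  1 min -> 2,880,000 requests/day
# every  5 min ->   576,000 requests/day
# every 15 min ->   192,000 requests/day
# every 60 min ->    48,000 requests/day
```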
Import/export chunks
Import/export documents are processed in chunks. You can define the size of these chunks by specifying the Fetch size. Using chunks can prevent transaction time-outs: the chunks are processed as separate transactions within the same run. This feature applies to both manual and scheduled runs [WebHelp link].
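Conceptually, chunked processing commits each batch of records as its own transaction, so no single transaction grows large enough to time out. The sketch below illustrates that pattern with Python's sqlite3 module; the fetch size, table, and data are hypothetical and do not represent Planon's import/export implementation.

```python
import sqlite3

# Conceptual sketch of chunked processing: each chunk is its own transaction,
# so no single transaction becomes large enough to hit a time-out.
# Fetch size, table, and data are hypothetical.
FETCH_SIZE = 100

records = [(f"ROW-{i}",) for i in range(1, 1001)]  # pretend import document rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE imported (code TEXT)")

for start in range(0, len(records), FETCH_SIZE):
    chunk = records[start:start + FETCH_SIZE]
    conn.executemany("INSERT INTO imported (code) VALUES (?)", chunk)
    conn.commit()  # commit per chunk: separate transactions within the same run

print(conn.execute("SELECT COUNT(*) FROM imported").fetchone()[0])  # 1000
conn.close()
```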
Authorization association links
When two business objects are linked using an m-to-n relation, the second business object (N, on which the Authorization filter is created) can be made available as an Association link to the first (M). Although this is a powerful mechanism, authorization links on reference fields and on associations can be followed in both directions, which makes it possible to create long chains. Such chains result in complex and slow queries. The advice is to be conservative when using association links [WebHelp link].
Mail merges
When generating mail merges, consider the size of the included content. If each mail is enriched with multiple images of several MB each, performance will be noticeably slower than for a mail merge with small content.
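A simple size estimate gives a feel for the impact: the total volume is roughly recipients × attachments × attachment size. The figures in the sketch below are purely illustrative.

```python
# Rough size estimate for a mail merge run; all figures are illustrative.
recipients = 500
images_per_mail = 3
image_size_mb = 4

total_mb = recipients * images_per_mail * image_size_mb
print(f"~{total_mb:,} MB ({total_mb / 1024:.1f} GB) to generate and send")
# ~6,000 MB (5.9 GB) to generate and send
```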
File storage
The file pop-ups can display many details of files (such as size and date). The more files there are in one directory, the slower the performance. Consider whether a more fine-grained directory structure could be implemented. You can automatically create a logical file structure on the file system by enabling the Planon managed functionality [WebHelp link].
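The idea behind a fine-grained structure is to keep the number of files per directory small, for example by fanning files out into subfolders per year and month. The sketch below shows that pattern in a generic way; the paths are hypothetical and this is not how the Planon managed functionality is implemented.

```python
from datetime import date
from pathlib import Path

# Generic illustration of a fine-grained directory layout: fan files out per
# year/month so no single directory accumulates thousands of entries.
# Paths are hypothetical; this is not Planon's "Planon managed" implementation.
ROOT = Path("/data/documents")

def target_path(file_name: str, added_on: date) -> Path:
    return ROOT / f"{added_on.year:04d}" / f"{added_on.month:02d}" / file_name

print(target_path("contract-0001.pdf", date(2024, 3, 7)))
# /data/documents/2024/03/contract-0001.pdf
```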
Business Analytics usage
For large reports, where the information does not need to be processed in real time, it might be better to use our data lake solution with MongoDB. Planon’s Connect for Analytics extends the operational Planon system with a data lake and a data connector to external data analytics tools. This simplifies the extraction of data for reporting, analytics, and predictions with standard BI tools [WebHelp link].
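As an impression of what reporting against a MongoDB data lake can look like, the sketch below runs an aggregation with the pymongo driver. The connection string, database, collection, and field names are hypothetical assumptions, not Planon's actual Connect for Analytics structure; consult the WebHelp for the real schema.

```python
from pymongo import MongoClient

# Hypothetical example of reporting against a MongoDB-based data lake instead
# of the operational database. Connection string, database, collection, and
# field names are assumptions, not Planon's actual Connect for Analytics schema.
client = MongoClient("mongodb://analytics.example.com:27017")
collection = client["datalake"]["work_orders"]

# Count completed work orders per property; this runs on the data lake, so the
# operational Planon database is not burdened by the report.
pipeline = [
    {"$match": {"status": "Completed"}},
    {"$group": {"_id": "$property_code", "count": {"$sum": 1}}},
    {"$sort": {"count": -1}},
]
for row in collection.aggregate(pipeline):
    print(row["_id"], row["count"])
```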
CAD Integrator
For large CAD drawings, where not all the details of the construction drawings are required, it might be better to consider the ‘Level of detail’ field in the ‘Floor attributes’ business object [WebHelp link].
Additionally, an automated system is in place that converts larger .svg construction drawings to image format, enabling faster loading times for CAD viewer drawings [WebHelp link]. Currently, space administrators cannot restrict this conversion. However, starting with release L110, a setting in the ‘Floor attributes’ business object will allow administrators to choose whether or not to apply this automatic conversion.
Note: Automatic conversion will not be applied to Kiosk.
Data volume
The larger a database table, the slower a query. Consider whether you could delete old, unused records. (If you need them for future reference, you might export them or store a database backup.) An option is to clean certain tables via a scheduled task. This approach is implemented by default for certain records (for example, old session data, old event logs, old tasks, and processed inbound and outbound messages) [WebHelp link].
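A cleanup job typically deletes records older than a retention period, optionally after exporting them for reference. The sketch below shows the general pattern with Python's sqlite3 module; the table, columns, and retention period are hypothetical and do not reflect Planon's built-in cleanup tasks.

```python
import sqlite3
from datetime import date, timedelta

# Generic sketch of retention-based cleanup; table, columns, and retention
# period are hypothetical and do not reflect Planon's built-in cleanup tasks.
RETENTION_DAYS = 365

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE event_log (id INTEGER PRIMARY KEY, logged_on TEXT, message TEXT)")
conn.executemany(
    "INSERT INTO event_log (logged_on, message) VALUES (?, ?)",
    [("2021-05-01", "old entry"), (date.today().isoformat(), "recent entry")],
)

cutoff = (date.today() - timedelta(days=RETENTION_DAYS)).isoformat()

# Optionally export the old rows first if they are still needed for reference.
old_rows = conn.execute("SELECT * FROM event_log WHERE logged_on < ?", (cutoff,)).fetchall()

conn.execute("DELETE FROM event_log WHERE logged_on < ?", (cutoff,))
conn.commit()
remaining = conn.execute("SELECT COUNT(*) FROM event_log").fetchone()[0]
print(f"exported {len(old_rows)} old rows, kept {remaining}")
conn.close()
```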
Event connector
Since release L106, messages that get stuck are interrupted after the timeout configured in System Settings. For the best scalability, this timeout should be set as low as possible. The performance logs can be used to obtain information on the timing of Event connector solutions [WebHelp link].