How to create a Spark timings report (Plugin)
Spark is a profiling plugin that helps you diagnose performance and memory issues and monitor the overall health of your server. This article will teach you how to set up and use Spark on your Minecraft server.
Table of Contents
Installing Spark
Using Spark
Installing Spark
Note: The Purpur server software already ships with Spark built in, so there is no need to install the plugin on your Minecraft server.
Head over to the Spark SpigotMC page here, then click the "Download Now" button.
Once downloaded, open the "File Manager".
Find the "plugins" folder, open it, and upload the JAR file you just downloaded by clicking the yellow upload button.
Restart your Minecraft server; the plugin should now load successfully.
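To confirm the installation, you can run the standard Bukkit/Spigot command below from the console or in game; spark should appear in the resulting plugin list once the server has restarted.
/plugins - to list all plugins currently loaded on the server.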
Using Spark
To start and manage a profiling session, you can use the following commands:
/spark profiler start - to start the profiler in the default operation mode.
/spark profiler stop - to stop the profiler and view the results.
/spark profiler info - to check the current status of the profiler.
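For example, a basic profiling session using only the commands above could look like this: run /spark profiler start, let the server run under normal load for a few minutes, then run /spark profiler stop. When the profiler is stopped, spark will print a link to the generated report, which you can open in your browser.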
There are also some additional flags that can be used to customize the behaviour of the profiler (a combined example follows the list). You can use:
/spark profiler start --timeout <seconds> - to start the profiler and automatically stop it after the specified number of seconds.
/spark profiler start --thread * - to start the profiler and track all threads.
/spark profiler start --alloc - to start the profiler and profile memory allocations (memory pressure) instead of CPU usage.
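These flags can also be combined. For example, assuming you want to profile all threads and have the profiler stop itself automatically after five minutes, you could run:
/spark profiler start --thread * --timeout 300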
If you require any further assistance, please create a ticket here.
Created By: Greg K.
Updated on: 19/12/2023