How to Use Spark on Your Minecraft Server
Spark is a lightweight and powerful performance profiling plugin that helps server owners identify lag sources, optimize resource usage, and troubleshoot potential problems. Whether you're managing a small private server or a large community, Spark provides detailed insights into server performance metrics like tick rates, plugin efficiency, and system resource usage.
In this guide, we’ll walk you through the steps to install, configure, and effectively use Spark to optimize your Minecraft server.
Note: Spark only works on Forge/Fabric servers.
How to Install Spark on a Minecraft Server
To install Spark on a Minecraft server:
Go to the Spark downloads page and find the build for your current Minecraft server version. Check the download's title to make sure the Spark build matches your server software (Fabric or Forge).
After downloading Spark, navigate to the File Manager.
Navigate to the mods folder and drop the downloaded Spark mod file into it.
Restart your server in the console.
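If your host also provides SFTP or shell access, the same file move can be sketched from a command line. This is a minimal illustration, not a required step: `spark-fabric.jar` is a placeholder for whatever file you actually downloaded, and the `touch` line merely stands in for that download.

```shell
# Illustrative sketch only: put the downloaded Spark jar into the mods folder.
SPARK_JAR="spark-fabric.jar"   # placeholder name for the file you downloaded
touch "$SPARK_JAR"             # stands in for the real download in this sketch
mkdir -p mods                  # the server's mods folder, created if missing
cp "$SPARK_JAR" mods/          # Spark loads from here on the next restart
ls mods/                       # confirm the jar is in place before restarting
```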
How to Use The Spark Plugin
Once installation is complete, run the "spark profiler start" command in the console, as shown in the image.
Wait around 10 minutes while the profiler gathers data, then run the "spark profiler stop" command.
After running the stop command, a link will appear in the console. Clicking the link takes you to a page with your tick times report.
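To make sense of the tick times report, it helps to know that Minecraft targets 20 ticks per second, i.e. a 50 ms budget per tick; anything slower shows up as lag. The small sketch below (not part of Spark, just an illustration) converts milliseconds-per-tick figures like those in the report into TPS:

```python
# Illustrative sketch: converting tick times (ms) into ticks per second (TPS).
# A healthy server stays at or under 50 ms per tick, i.e. a full 20 TPS.

def mspt_to_tps(mspt_ms: float) -> float:
    """Milliseconds-per-tick -> ticks-per-second, capped at the 20 TPS target."""
    return min(20.0, 1000.0 / mspt_ms)

# Example tick times as you might read them off a report:
for mspt in (45.0, 50.0, 80.0):
    print(f"{mspt:.0f} ms/tick -> {mspt_to_tps(mspt):.1f} TPS")
```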
How to Use Spark to Create a Timings Report
To start and manage a profiler reading with Spark, you can use the following commands:
/spark profiler start - to start the profiler in the default operation mode.
/spark profiler stop - to stop the profiler and view the results.
/spark profiler info - to check the current status of the profiler.
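Put together, a typical profiling session using these commands might look like this (run from the console or in-game chat):

```
/spark profiler start   # begin recording in the default mode
/spark profiler info    # optional: confirm the profiler is running
/spark profiler stop    # end the recording and get the report link
```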
There are some additional flags which can be used to customize the behaviour of the Spark profiler. You can use:
/spark profiler start --timeout <seconds> - to start the profiler and automatically stop it after the given number of seconds.
/spark profiler start --thread * - to start the profiler and track all threads.
/spark profiler start --alloc - to start the profiler and profile memory allocations (memory pressure) instead of CPU usage.
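These flags can also be combined. For example, to reproduce the ten-minute reading described above without stopping the profiler manually (the 600-second value is illustrative; pick any duration):

```
/spark profiler start --timeout 600              # stop automatically after 10 minutes
/spark profiler start --thread * --timeout 600   # same, but tracking all threads
```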
If you currently have a ticket open due to server lag, feel free to hand the timings report link over to our staff team. Otherwise, a more in-depth explanation of the profiler can be found on the Spark documentation page.
Still have some questions?
If you require any further assistance, please create a ticket here.
Created By: Daniel R.
Edited By: Mason Baker
Updated on: 15/01/2025
Thank you!