Provide the ability to upload patches #78 #104

Open
wants to merge 14 commits into base: main
Conversation

ChiuchiuSorin (Collaborator)

Created new endpoints for the package upload operation and refactored the healthCheckJob code to be usable by multiple jobs.

The import package page:
importPage

The import package result:
packageUpload

The modifications to the files section:
filesSection

* created the upload job request and status
* renamed HealthCheckResult to CustomJobResult and HealthCheckResultLevel to CustomJobResultLevel
* created a new REST endpoint for package upload whose role is to instantiate a job with the given reference and timestamp (a hedged sketch of such an endpoint follows this commit list)
* added a new endpoint to the ServerInfo interface for retrieving the XWiki installation path
* implemented the logic of the package archive upload, which includes: original file search and backup creation, batch file save (file replacement), and batch backup restore in case of an error
* created the ImportPackage page for the user interface
* created the hidden PackagesStore document for archive storage
* created a liveData table for saving a package and starting the upload process
* created a liveData table (with the PackagesJSON document) for showing the already uploaded packages
* created a template for executing and showing the progress of a package upload
* code refactoring
* added translations
# Conflicts:
#	application-admintools-ui/pom.xml
* remove packagesStore document
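For illustration only, here is a minimal, hedged sketch of the kind of REST resource described above: it builds a job request and hands it to the XWiki JobExecutor. The class name, path, component hint, parameter name and id layout are assumptions made for this sketch, not the code of this PR.

import java.util.Arrays;

import javax.inject.Inject;
import javax.inject.Named;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.Response;

import org.xwiki.component.annotation.Component;
import org.xwiki.job.JobExecutor;
import org.xwiki.rest.XWikiResource;

// Hypothetical resource: it only creates the request and delegates the work to a job.
@Component
@Named("org.xwiki.contrib.admintools.rest.PackageUploadResource")
@Path("/admintools/upload")
public class PackageUploadResource extends XWikiResource
{
    @Inject
    private JobExecutor jobExecutor;

    @POST
    public Response uploadPackage(@QueryParam("fileRef") String fileRef) throws Exception
    {
        PackageUploadJobRequest request = new PackageUploadJobRequest();
        // The attachment reference plus the request start time keep the job id unique.
        request.setId(Arrays.asList("adminTools", "upload", fileRef,
            String.valueOf(System.currentTimeMillis())));
        jobExecutor.execute("admintools.packageupload", request);
        return Response.ok().build();
    }
}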
@ChiuchiuSorin ChiuchiuSorin self-assigned this Oct 10, 2024
@ChiuchiuSorin ChiuchiuSorin linked an issue Oct 10, 2024 that may be closed by this pull request
@ChiuchiuSorin (Collaborator, Author)

There are no tests added yet; I will add them in a future commit.

* created upload job request and status
* renamed HealthCheckResult to CustomJobResult and HealthCheckResultLevel to CustomJobResultLevel
* code refactoring
@ChiuchiuSorin (Collaborator, Author)

This PR depends on modifications made in #107.

* refactored CustomJobResult to JobResult
* refactored CustomJobResultLevel to JobResultLevel
# Conflicts:
#	application-admintools-ui/src/main/resources/AdminTools/Code/Translations.xml
* added unit tests for UploadJob
* refactored UploadJob into UploadJob and UploadJobFileProcessor
*/
public PackageUploadJobRequest()
{
    setDefaultId();
Not sure we should do this. If we do have multiple jobs with the same id, we should be mindful that only one upload job should run at a time. I'm not sure what the behaviour of the job manager/executor is if you try to execute two jobs with the same id concurrently. If it does allow it, the job logs will probably be overwritten in the permanent directory, and the job progress might also act up.

On second thought, one might not be interested in saving ALL the logs of a potentially very large number of uploads (depends on your use case). If Admin Tools is installed as root, it should be OK to have only one id for all the upload jobs. Otherwise, it might be better to have an id per wiki. But I still think we should take precautions and not run these jobs simultaneously.

@ChiuchiuSorin (Collaborator, Author), Feb 17, 2025

Admin Tools is used as root, and for the upload jobs that I'm creating I use the attachment reference and the start time of the job to build the job ID, so the jobs are unique. If you think it would be safer to ensure that only one job runs at a time, I could create a general ID and check whether there is any job running for that ID; the issue with this is that the logs would be overwritten. To prevent multiple jobs with the same file reference from running at the same time, I'm using a JobGroupPath.
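To make the concurrency argument concrete, here is a minimal, hedged sketch of how a unique id and a per-file JobGroupPath can coexist, assuming the upload job implements XWiki's GroupedJob interface; the class body and the id layout below are assumptions for illustration, not the actual code of this PR.

import java.util.Arrays;

import org.xwiki.job.AbstractJob;
import org.xwiki.job.GroupedJob;
import org.xwiki.job.JobGroupPath;

public class PackageUploadJobSketch extends AbstractJob<PackageUploadJobRequest, PackageUploadJobStatus>
    implements GroupedJob
{
    @Override
    public String getType()
    {
        return "admintools.packageupload";
    }

    @Override
    public JobGroupPath getGroupPath()
    {
        // Assumed id layout: ["adminTools", "upload", <fileRef>, <startTime>].
        // Grouping only on the file reference makes the executor serialize jobs that
        // target the same file, while uploads of different files can still run in
        // parallel; the timestamp in the id keeps their logs separate.
        String fileRef = getRequest().getId().get(2);
        return new JobGroupPath(Arrays.asList("adminTools", "upload", fileRef));
    }

    @Override
    protected void runInternal() throws Exception
    {
        // The actual file processing would happen here.
    }
}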

@Unstable
public class PackageUploadJobStatus extends DefaultJobStatus<PackageUploadJobRequest>
{
    private List<JobResult> uploadLogs = new LinkedList<>();
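For context, a hedged sketch of the kind of accessors such a status class might expose so the progress template can read the collected log entries; the method names below are assumptions, not necessarily the PR's actual API.

// Inside PackageUploadJobStatus (illustrative only):
public void addLog(JobResult log)
{
    this.uploadLogs.add(log);
}

public List<JobResult> getUploadLogs()
{
    return this.uploadLogs;
}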
* moved the save logic from UploadJob to UploadJobFileProcessor
* created a method to save the backups in a dedicated backup folder inside the XWiki permanent directory (a rough, hedged sketch follows this list)
* removed the batchSave function
* adapted tests
* modified and added comments
* added translations
* added more job logs
* code refactoring
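For illustration, a minimal, hedged sketch of backing a file up under the XWiki permanent directory before replacing it, using the standard org.xwiki.environment.Environment API; the class name, folder name and error handling are assumptions, not the code of this PR.

import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

import org.xwiki.environment.Environment;

public class UploadBackupSketch
{
    private final Environment environment;

    public UploadBackupSketch(Environment environment)
    {
        this.environment = environment;
    }

    // Copies the original file into a backup folder under the permanent directory,
    // so it can be restored if the batch replacement fails later on.
    public Path backupOriginal(Path original) throws Exception
    {
        Path backupRoot = new File(this.environment.getPermanentDirectory(), "adminTools/backup").toPath();
        Files.createDirectories(backupRoot);
        Path backup = backupRoot.resolve(original.getFileName());
        if (Files.exists(original)) {
            Files.copy(original, backup, StandardCopyOption.REPLACE_EXISTING);
        }
        return backup;
    }
}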
Successfully merging this pull request may close these issues.

Provide the ability to upload patches
2 participants