Data Storage

This guide shows developers how to securely integrate image files into the EO Platform through the complete "Add Data → List Verification → Detail Confirmation" workflow, which spans frontend dialogs, backend APIs, and background tasks.

1. Prerequisites

Before starting the integration, please ensure the following environment is ready:

  • Account Permissions: You need access and write permissions for the "Data Storage" module.
  • Object Storage Permissions: You have configured OBS/S3-compatible upload permissions, or you can use the temporary credentials returned by the platform.
  • Running Services:
    • data_management_service (Java service) is deployed and connected to PostgreSQL.
    • data_process_service (Python, for image parsing tasks) is running correctly and connected to PostgreSQL.

2. Add Data Upload Workflow

Figure: complete Add Data upload workflow sequence diagram

2.1 Frontend Operations

  1. Click Add Data in the upper right corner of the data storage page to open the dialog.
  2. Choose the source of the file:
    • Local: Drag and drop or select a file from your local machine (single file ≤ 5 GB; for vectors, provide associated files like .shp/.shx/.dbf/.prj).
    • Link: Fill in an external storage URL for the backend to pull asynchronously.
  3. Fill in the form fields:
| Field | Required | Description |
| --- | --- | --- |
| Data Name | Yes | Display name in the platform, ≤ 50 characters |
| Upload to Folder | Yes | Target directory, e.g., Data Storage |
| Category | Yes | Business category (Satellite, Other, etc.) |
| Type | Yes | Data type (Raster, Vector, 3D Model, BIM, etc.) |
| Capture Time | No | Capture time, YYYY-MM-DD HH:MM:SS recommended |
| Sensor / Provider | No | Sensor and provider information |
  4. Click Save to call the appropriate pre-upload API, such as POST /datafile/file/pre-upload/raster.
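As a reference, the sketch below shows how the form fields above can map onto the pre-upload request body, applying the validation rules detailed in the next subsection. The field names used in this guide (fileName, parentId, category, extension, fileSize, captureTime, sensor) are kept as-is; the helper itself and its input shape are illustrative assumptions, not the platform's exact contract.

```typescript
// Sketch: mapping the Add Data form onto a pre-upload payload.
// Field names follow this guide; everything else is an assumption.
interface PreUploadPayload {
  fileName: string;       // display name, ≤ 50 characters
  parentId: number;       // resolved from the folder tree; root = -1
  category: string;       // e.g. "Satellite"
  extension: string;      // derived from the file name (Local) or URL suffix (Link)
  fileSize: number;       // bytes; must satisfy 0 < fileSize < 5 GB
  captureTime?: number;   // epoch seconds (the picker yields milliseconds)
  sensor?: string;
}

const MAX_FILE_SIZE = 5 * 1024 ** 3; // 5 GB

function buildPreUploadPayload(form: {
  dataName: string;
  parentId: number;
  category: string;
  file: File;
  captureTimeMs?: number;
  sensor?: string;
}): PreUploadPayload {
  // Size enforcement: 0 < size < 5 GB (mirrors the beforeUpload and submit checks).
  if (form.file.size <= 0 || form.file.size >= MAX_FILE_SIZE) {
    throw new Error("File size must be greater than 0 and less than 5 GB");
  }
  return {
    fileName: form.dataName.slice(0, 50),
    parentId: form.parentId,
    category: form.category,
    extension: form.file.name.split(".").pop()?.toLowerCase() ?? "",
    fileSize: form.file.size,
    // Convert milliseconds to seconds for captureTime, per the optional-fields note.
    captureTime:
      form.captureTimeMs !== undefined
        ? Math.floor(form.captureTimeMs / 1000)
        : undefined,
    sensor: form.sensor,
  };
}
```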

2.2 Validation and Upload Logic

Figure: upload workflow sequence diagram

  • Basic validation

    • uploadMethod: 0 = Local, 1 = Link
    • Required fields: the Link source requires a valid URL
    • Size enforcement: 0 < size < 5 GB (checked in beforeUpload and again on submit)
    • Form payload: fileName, parentId (resolved from the folder tree), category, extension, fileSize
    • Optional fields: captureTime (convert milliseconds to seconds), cloudOver, sensor, imageLevel
    • Extension detection: Local derives from the file name; Link derives from the URL path suffix
  • Local upload

    1. Call the pre-upload API based on fileType (raster=0, vector=1, model=2, other=3, bim=4, pointcloud=5, document=6).
    2. Upload the file with a PUT request to the returned uploadUrl; refer to https://learn.microsoft.com/en-us/rest/api/storageservices/put-blob-from-url?tabs=microsoft-entra-id
    3. After the upload succeeds, call the corresponding upload-success endpoint for the same fileType.
  • Link upload

    1. Reuse the same pre-upload routing as Local based on fileType.
    2. Once pre-upload succeeds, close the dialog and poll GET /datafile/link/upload/process every 5 seconds to update uploadedStatus and failedMsg, emitting link-upload progress events as it goes (see the polling sketch after this list).
  • Directory selection: Resolve the chosen folder tree node to parentId (root = -1) and show the "Data Storage / …" path in the form.

  • Other interactions: Auto-fill dataName from the first selected file (≤ 50 characters); enable canSave only when required fields are satisfied; call POST /datafile/file/edit for rename or move operations.
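The link-upload polling loop can be sketched as follows. The 5-second interval, the endpoint path, and the response fields (uploadedStatus, failedMsg) come from this guide; the fileId query parameter and the terminal status values are assumptions to be checked against your deployment.

```typescript
// Sketch: poll GET /datafile/link/upload/process every 5 seconds and emit
// progress events until the upload reaches a terminal state.
function pollLinkUploadProgress(
  fileId: string,
  token: string,
  onProgress: (uploadedStatus: string, failedMsg?: string) => void,
): () => void {
  const timer = setInterval(async () => {
    const res = await fetch(
      // The fileId query parameter is an assumed identifier for the pending upload.
      `/datafile/link/upload/process?fileId=${encodeURIComponent(fileId)}`,
      { headers: { Authorization: `Bearer ${token}` } },
    );
    if (!res.ok) {
      clearInterval(timer);
      onProgress("FAILED", `HTTP ${res.status}`);
      return;
    }
    const { uploadedStatus, failedMsg } = await res.json();
    onProgress(uploadedStatus, failedMsg); // drives the link-upload progress event
    // Assumed terminal statuses; adjust to the actual values your platform returns.
    if (uploadedStatus === "SUCCESS" || uploadedStatus === "FAILED") {
      clearInterval(timer);
    }
  }, 5000);
  // Return a cancel handle so the caller can stop polling, e.g. on dialog close.
  return () => clearInterval(timer);
}
```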


2.3 API Call Sequence

The table below uses Raster (0) as the reference flow. For other types, switch the :type suffix accordingly.

| Step | API | Key Return Value / Parameter | Description |
| --- | --- | --- | --- |
| 1 | POST /datafile/file/pre-upload/:type | uploadUrl | Create an upload URL |
| 2 | PUT to the returned uploadUrl | — | Upload the file; refer to https://learn.microsoft.com/en-us/rest/api/storageservices/put-blob-from-url?tabs=microsoft-entra-id |
| 3 | POST /datafile/file/upload-success/:type | dataFileId | Notify the server to validate and ingest |
| 4 | — | HTTP 200 | Return success and refresh the list |

Error handling: If any API returns 4xx/5xx, stop the workflow, surface the message to the user, and consult the Error Codes Reference.


  1. The frontend first calls POST /datafile/file/pre-upload/raster to receive uploadUrl.
  2. Send a PUT request to the returned uploadUrl to transfer the file bytes; refer to https://learn.microsoft.com/en-us/rest/api/storageservices/put-blob-from-url?tabs=microsoft-entra-id
  3. After the upload, the frontend calls the save API POST /datafile/file/upload-success/raster, sending fileId, file checksums/size, and form fields to persist the record.
  4. The server writes to PostgreSQL and dispatches asynchronous tasks:
    • .tif/.tiff imagery triggers a Python job to extract spatial extent, projection, resolution, cloud cover, etc.
    • Parsed metadata is stored for later list/detail queries.
  5. When the API returns success, show "Add Data successful" and refresh the list.
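A minimal end-to-end sketch of steps 1–3 follows, reusing the PreUploadPayload shape from the section 2.1 sketch. The endpoint paths and the uploadUrl/dataFileId fields come from this guide; the fileId field name, the response envelopes, and the Azure-specific x-ms-blob-type header are assumptions (S3/OBS-style signed URLs typically do not need that header).

```typescript
// Sketch: the three-call local upload sequence for a raster file (fileType 0).
async function uploadRaster(
  file: File,
  payload: PreUploadPayload, // built from the Add Data form (see section 2.1)
  token: string,
): Promise<string> {
  const jsonHeaders = {
    Authorization: `Bearer ${token}`,
    "Content-Type": "application/json",
  };

  // 1. Pre-upload: ask the server for a signed upload URL.
  const pre = await fetch("/datafile/file/pre-upload/raster", {
    method: "POST",
    headers: jsonHeaders,
    body: JSON.stringify(payload),
  });
  if (!pre.ok) throw new Error(`pre-upload failed: HTTP ${pre.status}`);
  const { uploadUrl, fileId } = await pre.json();

  // 2. PUT the file bytes to the signed URL. The x-ms-blob-type header is
  //    required by Azure Blob Storage's Put Blob operation; omit it for
  //    S3/OBS-style URLs.
  const put = await fetch(uploadUrl, {
    method: "PUT",
    headers: { "x-ms-blob-type": "BlockBlob" },
    body: file,
  });
  if (!put.ok) throw new Error(`object upload failed: HTTP ${put.status}`);

  // 3. Notify the server so it can validate, persist, and dispatch parsing
  //    tasks; per step 3 above, checksums/size and form fields go here too.
  const done = await fetch("/datafile/file/upload-success/raster", {
    method: "POST",
    headers: jsonHeaders,
    body: JSON.stringify({ fileId }),
  });
  if (!done.ok) throw new Error(`upload-success failed: HTTP ${done.status}`);
  const { dataFileId } = await done.json();
  return dataFileId; // on success, show "Add Data successful" and refresh the list
}
```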

3. Verifying Ingestion Results in the List

  1. Successfully ingested records appear in the list, sorted by creation time (descending).
  2. Use the left filter panel to locate entries quickly:
| Filter Item | Field | Description |
| --- | --- | --- |
| Category | category | Business category |
| Type | fileExtension | File extension |
| Capture Time | capTimeStart / capTimeEnd | Capture time range |
| Maximum Clouds on Image | cloudCover | Cloud cover (0–100%) |
  3. List columns mirror the response from POST /datafile/query/page; you can reuse the API for automated validation (see the sketch below).
  4. If parsing is still in progress, the status column may show "Processing"; once complete, cloud cover and spatial extent are updated.

The refresh logic relies on the pagination API. If the newest data is missing, refresh or reissue the query.
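For automated validation, the same pagination API can be called directly. The filter field names come from the table above; the pagination parameter names (pageNum, pageSize) and the records property in the response are assumptions.

```typescript
// Sketch: query the first page with filters and look for a just-uploaded file.
async function findNewestFile(token: string, expectedName: string) {
  const res = await fetch("/datafile/query/page", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      pageNum: 1,                          // assumed pagination parameter names
      pageSize: 20,
      category: "Satellite",               // business category filter
      fileExtension: "tif",                // type filter
      capTimeStart: "2024-01-01 00:00:00", // capture time range
      capTimeEnd: "2024-12-31 23:59:59",
      cloudCover: 30,                      // maximum cloud cover, percent
    }),
  });
  if (!res.ok) throw new Error(`query failed: HTTP ${res.status}`);
  const page = await res.json();
  // Results are sorted by creation time descending, so a fresh upload
  // should land on the first page.
  return page.records?.find(
    (r: { fileName: string }) => r.fileName === expectedName,
  );
}
```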


4. Detail Verification and Metadata Confirmation

  1. Click the data name or "Details" to open the drawer/page.
  2. Detail content comes from GET /datafile/query/file/details, including:
    • Basic attributes: Name, category, type, size, uploader, capture time.
    • Spatial info: boundingBox, coordinate system (EPSG), resolution (written by the Python task).
    • Business metadata: Cloud cover, tags, linked datasets or analysis tasks.
    • History: Processing status, publishing timeline, operation logs.
  3. If parsing fails, inspect the failure reason or retry entry in the UI, and cross-check backend logs.
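Based on the fields listed above, a response from GET /datafile/query/file/details might be typed roughly as follows; property names other than boundingBox are illustrative and should be checked against your deployment's actual schema.

```typescript
// Sketch: approximate shape of the file-details response.
interface DataFileDetails {
  name: string;
  category: string;
  type: string;
  size: number;             // bytes
  uploader: string;
  captureTime?: string;
  boundingBox?: number[];   // spatial extent, written by the Python parsing task
  epsg?: number;            // coordinate system identifier
  resolution?: number;      // ground resolution, written by the Python task
  cloudCover?: number;      // percent
  tags?: string[];
  processingStatus: string; // e.g. "Processing" while parsing is still running
}
```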

5. API Quick Index

| Capability | API | Description |
| --- | --- | --- |
| Pre-upload | POST /datafile/file/pre-upload/:type | Returns the uploadUrl and upload credentials |
| Upload Complete | POST /datafile/file/upload-success/:type | Notify the server to validate and ingest |
| Link Upload Progress | GET /datafile/link/upload/process | Poll external link upload status |
| Paginated Query | POST /datafile/query/page | List pagination and filtering |
| Query File Details | GET /datafile/query/file/details | Get file metadata details |
| Edit File Info | POST /datafile/file/edit | Rename file, move path, add description |

6. Debugging and Troubleshooting

  • OBS Upload Failure: Check whether temporary credentials have expired, multipart parameters are correct, or inspect the browser console for details.
  • Record Not Ingested: Confirm the pre-upload stage succeeded; inspect the database and task queue if not.
  • Missing Metadata: Review Python task logs to verify the source GeoTIFF carries projection info; rerun extraction if necessary.
  • API 401/403: Ensure the Authorization header carries a valid token and the account has access rights; refer to the Error Codes Reference.

On any exception, the service returns an error code. Use the Error Codes Reference to troubleshoot.
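A small wrapper like the one below centralizes the Authorization header and error-code handling; the { code, message } error body shape is an assumption to be verified against the Error Codes Reference.

```typescript
// Sketch: a fetch wrapper that always attaches the token and surfaces
// platform error codes on 4xx/5xx responses.
async function apiFetch(
  url: string,
  token: string,
  init: { method?: string; headers?: Record<string, string>; body?: string } = {},
): Promise<unknown> {
  const res = await fetch(url, {
    ...init,
    headers: { ...(init.headers ?? {}), Authorization: `Bearer ${token}` },
  });
  if (res.status === 401 || res.status === 403) {
    throw new Error("Token missing/expired, or the account lacks access rights");
  }
  if (!res.ok) {
    // Assumed error envelope; look `code` up in the Error Codes Reference.
    const { code, message } = await res.json();
    throw new Error(`API error ${code}: ${message}`);
  }
  return res.json();
}
```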