How to Upload Data to a Repository

You can upload data to repositories in which the Timeline API data source is enabled. For example, you may have dedicated data managers whose only task is to upload tables to Timeline repositories for further processing; they need neither access to other features of the product nor any other actions in the Timeline interface. In this case, they can upload data through a series of Timeline endpoint calls.
Note. For information on where to find path parameter values for the endpoints, see Repository path parameters.

Before you begin

Make sure that the target table is created in the needed repository.
Important. At the moment, it is impossible to create a new table using API requests.

You may want to obtain information about the available repositories and the tables they contain, or create a completely new repository:

  • Send a GET request to obtain the list of available repositories in which the Timeline API data source is enabled.
    Endpoint:
    {your.timeline.instance}.com/api/ext/1.0/repository

    In response, you will receive a list of available repositories, from which you can copy the ID of the needed one.

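    A minimal cURL sketch of this call. The https:// scheme and the Bearer authorization header are placeholders; substitute the scheme and authentication method used by your Timeline instance.

    curl -X GET "https://{your.timeline.instance}.com/api/ext/1.0/repository" \
      -H "Authorization: Bearer <your_api_token>"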

  • Send a GET request to obtain the list of tables existing in the repository.
    Endpoint:
    {your.timeline.instance}.com/api/ext/1.0/repository/{repositoryId}/table

    Required parameters: repositoryId
    In response, you will receive a list of tables available to you in this repository. You can copy the ID of the needed table from this list.

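    A cURL sketch of this call (the https:// scheme and the authorization header are placeholders, as in the previous example; replace {repositoryId} with a real ID):

    curl -X GET "https://{your.timeline.instance}.com/api/ext/1.0/repository/{repositoryId}/table" \
      -H "Authorization: Bearer <your_api_token>"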

  • Send a POST request to create a new repository with the Timeline API data source enabled. Provide the name for the new repository in the request body.
    Endpoint:
    {your.timeline.instance}.com/api/ext/1.0/repository

    Request body example:
{
  "name": "Repository-New"
}
    

In response, you will receive:

  • id - the identifier of the new repository.
  • name - the name of the new repository.
  • role - your role in the new repository. When creating a repository, you obtain the Owner role in it by default.

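    A cURL sketch of this call (scheme and authorization header are placeholders, as above):

    curl -X POST "https://{your.timeline.instance}.com/api/ext/1.0/repository" \
      -H "Authorization: Bearer <your_api_token>" \
      -H "Content-Type: application/json" \
      -d '{"name": "Repository-New"}'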

Uploading data

  1. Send a POST request to generate and retrieve an upload URL to transmit a file. Endpoint:
    {your.timeline.instance}.com/api/ext/1.0/repository/{repositoryId}/file/upload-url

    Required parameters: repositoryId
    Request body example:
{
  "fileName": "File_to_Upload_Name.csv"
}
    

In response, you will get a URL that points to temporary storage for uploaded files. The files are loaded into Timeline from this storage.
Also copy the key and value of the response header:

"headers": {
    "x-amz-acl": "bucket-owner-full-control"
}
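
    A cURL sketch of this request (scheme and authorization header are placeholders, as in the examples above):

    curl -X POST "https://{your.timeline.instance}.com/api/ext/1.0/repository/{repositoryId}/file/upload-url" \
      -H "Authorization: Bearer <your_api_token>" \
      -H "Content-Type: application/json" \
      -d '{"fileName": "File_to_Upload_Name.csv"}'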
    
  2. Send a PUT request to the URL generated in the previous step to upload the file there. Include the headers received along with the URL and provide the raw file data as the request body.
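    A cURL sketch of the upload; <upload_url_from_the_previous_step> is a placeholder for the generated URL, and --upload-file sends a PUT request with the raw file contents as the body:

    curl --upload-file "File_to_Upload_Name.csv" \
      -H "x-amz-acl: bucket-owner-full-control" \
      "<upload_url_from_the_previous_step>"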
  3. After the upload is finished, send a POST request to trigger final processing on the uploaded file and load it into the target repository. Provide the file key received in the previous step in the request body.
    Endpoint:
    {your.timeline.instance}.com/api/ext/1.0/repository/{repositoryId}/load

    Required parameters: repositoryId
    Request body example:
{
  "fileKeys": [
    "repository/1/1625641920000/File_to_Upload_Name.csv"
  ],
  "tableName": "data"
}
    

You will receive a processingId in response.
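    A cURL sketch of this request (scheme and authorization header are placeholders, as above; use the file key you received for your own upload):

    curl -X POST "https://{your.timeline.instance}.com/api/ext/1.0/repository/{repositoryId}/load" \
      -H "Authorization: Bearer <your_api_token>" \
      -H "Content-Type: application/json" \
      -d '{"fileKeys": ["repository/1/1625641920000/File_to_Upload_Name.csv"], "tableName": "data"}'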

  4. Send a GET request with the processingId returned by the previous request. Endpoint:
    {your.timeline.instance}.com/api/ext/1.0/processing/{processingId}

    Poll the processing endpoint until the status becomes FINISHED.
    As a result, the file will be loaded, and a new table with the specified name will be created in the specified repository.
    Note. If a table with the provided name already exists, all data from that table will be deleted and overwritten by the new data.
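    A cURL sketch of the status check (scheme and authorization header are placeholders, as above); repeat the call until the returned status is FINISHED:

    curl -X GET "https://{your.timeline.instance}.com/api/ext/1.0/processing/{processingId}" \
      -H "Authorization: Bearer <your_api_token>"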

Examples

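Below is a minimal end-to-end sketch for a Unix-like shell that chains the calls described above. It assumes bash, curl, and jq are available, that authorization uses a Bearer token, and that the upload-url response exposes the upload URL and the file key as fields named url and key; these names are assumptions, so check them against an actual response from your instance before use.

    #!/usr/bin/env bash
    # Minimal end-to-end upload sketch. Placeholders and assumptions:
    #   - replace the BASE host, TOKEN, REPO_ID, FILE, and TABLE values with your own;
    #   - Bearer-token authorization is an assumption; use your instance's auth method;
    #   - the response fields "url" and "key" of the upload-url call are assumptions.
    set -euo pipefail

    BASE="https://{your.timeline.instance}.com/api/ext/1.0"   # replace with your instance host
    TOKEN="<your_api_token>"
    REPO_ID="<repositoryId>"
    FILE="File_to_Upload_Name.csv"
    TABLE="data"
    AUTH="Authorization: Bearer ${TOKEN}"

    # 1. Generate an upload URL for the file.
    UPLOAD=$(curl -s -X POST "${BASE}/repository/${REPO_ID}/file/upload-url" \
      -H "${AUTH}" -H "Content-Type: application/json" \
      -d "{\"fileName\": \"${FILE}\"}")
    UPLOAD_URL=$(echo "${UPLOAD}" | jq -r '.url')   # assumed field name
    FILE_KEY=$(echo "${UPLOAD}" | jq -r '.key')     # assumed field name

    # 2. Upload the raw file to the temporary storage (--upload-file sends a PUT).
    curl -s --upload-file "${FILE}" \
      -H "x-amz-acl: bucket-owner-full-control" \
      "${UPLOAD_URL}"

    # 3. Trigger final processing of the uploaded file.
    PROCESSING_ID=$(curl -s -X POST "${BASE}/repository/${REPO_ID}/load" \
      -H "${AUTH}" -H "Content-Type: application/json" \
      -d "{\"fileKeys\": [\"${FILE_KEY}\"], \"tableName\": \"${TABLE}\"}" | jq -r '.processingId')

    # 4. Poll the processing endpoint until the status becomes FINISHED.
    while true; do
      STATUS=$(curl -s -X GET "${BASE}/processing/${PROCESSING_ID}" -H "${AUTH}" | jq -r '.status')
      echo "Processing status: ${STATUS}"
      [ "${STATUS}" = "FINISHED" ] && break
      sleep 5
    done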

