FAQ

Q: How do I include credentials in my data archive pipeline JAR file?

A: Embedding credentials in the data archive pipeline JAR file is strongly discouraged for security reasons. The platform manages the pipeline's credentials on behalf of the user. To learn more about credentials, see:

Identity & Access Management Guide

Q: How do I search application logs?

A: Refer to:

Search Application Logs

Q: How do I monitor my data archive pipeline?

A: Refer to:

HERE platform Metrics

Pipeline Monitoring

Q: Are there any system metrics for my data archive pipeline?

A: The Data Archiving Library internally uses the Flink Streaming Execution Environment. The available system metrics are described in:

Flink Metrics

To access the dashboard for these system metrics, open the HERE platform portal and click Tools, then Monitoring and Alerts. Click Home, search for Flink Metrics, and click the dashboard. You can then select your Pipeline ID and view the metrics.

Q: Why are messages not being archived?

A: A message is not archived, and the Data Archiving Library continues processing the next message, if:

  1. The getKeys(), getMultipleKeys(), or getSplittedKeys() method throws an exception or returns null.
  2. An attribute value's data type does not match the data type defined in the index definitions (for example, a timewindow value provided as a string).
  3. The value of the timewindow or heretile attribute is null.
  4. The heretile value does not match the zoomLevel provided in the index definitions.

Note

If the value of an attribute other than timewindow and heretile is null, the message is still archived.

For detailed information, see Validation Rules for Indexing Attributes.
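
To make these rules concrete, below is a minimal, hypothetical sketch of a getKeys() implementation. The class name, the extractTimestampMs() helper, and the vehicleType attribute are illustrative assumptions, not part of the Data Archiving Library API; only the timewindow and heretile attribute names and the null/type rules come from the text above.

import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch illustrating the skip rules above; the helper method and
// the "vehicleType" attribute are assumptions for illustration only.
public class ExampleKeyExtractor {

    public Map<String, Object> getKeys(byte[] payload) {
        Long timestampMs = extractTimestampMs(payload); // hypothetical parsing helper
        if (timestampMs == null) {
            // Returning null (like throwing an exception) skips this message;
            // the library continues with the next one.
            return null;
        }
        Map<String, Object> keys = new HashMap<>();
        keys.put("timewindow", timestampMs); // must be a Long: epoch time in milliseconds
        keys.put("heretile", 23618402L);     // must be a Long tile id matching the configured zoomLevel
        keys.put("vehicleType", null);       // null is allowed for other attributes; the message is still archived
        return keys;
    }

    private Long extractTimestampMs(byte[] payload) {
        // Parse the payload and return the event time in milliseconds,
        // or null if the payload is malformed (stand-in logic, hypothetical).
        return payload == null || payload.length == 0 ? null : System.currentTimeMillis();
    }
}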

Q: How do I investigate a data archive pipeline that has failed?

A: A logging URL is attached to your pipeline version. Clicking it takes you to Splunk, where you can start investigating the failure. To check whether your pipeline failed because of the Data Archiving Library, edit the search query by appending additional filters. For Data Archiving Library error filters, see the following example:

index=olp-here-test_common namespace=olp-here-test-j-d07b5ee2-cbbd-48f6-b0e3-5b0d87703fed "*DAL:*"

Note

The Data Archiving Library fails the pipeline if a defined indexing attribute is missing from the map returned by the getKeys(), getMultipleKeys(), or getSplittedKeys() method. Also note the required data types:

  1. The timewindow attribute value must be a long (epoch time in milliseconds).
  2. The heretile attribute value must be a long (HERE tile ID).

If no results are found, then the cause of your failure may not be the Data Archiving Library itself but rather one of the services on which the library depends:

  • Pipeline
  • Flink Runtime
  • Stream Layer
  • HERE Account

For further troubleshooting information, see:

Troubleshooting

Q: If I use the reference examples, how do I parse the output content after running my data archive pipeline?

A: To get binary data, see:

Get Data from Index Layer
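
The snippets below call a getBlobStoreData(...) helper, which stands in for retrieving a blob by its data handle as described in Get Data from Index Layer. As a rough illustration only, such a helper might look like the following sketch; the request path, the base URL resolution (normally done through the API Lookup service), and the token handling are assumptions, not a definitive client implementation.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hedged sketch of a getBlobStoreData(...) helper. Assumptions: blobApiBaseUrl
// has been resolved via the API Lookup service and bearerToken is a valid OAuth token.
public class BlobStoreClient {

    private final HttpClient http = HttpClient.newHttpClient();
    private final String blobApiBaseUrl; // assumed to be resolved beforehand
    private final String layerId;
    private final String bearerToken;

    public BlobStoreClient(String blobApiBaseUrl, String layerId, String bearerToken) {
        this.blobApiBaseUrl = blobApiBaseUrl;
        this.layerId = layerId;
        this.bearerToken = bearerToken;
    }

    public byte[] getBlobStoreData(String dataHandle) throws Exception {
        // GET <blobApiBaseUrl>/layers/<layerId>/data/<dataHandle> (assumed path shape)
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(blobApiBaseUrl + "/layers/" + layerId + "/data/" + dataHandle))
                .header("Authorization", "Bearer " + bearerToken)
                .build();
        HttpResponse<byte[]> response = http.send(request, HttpResponse.BodyHandlers.ofByteArray());
        if (response.statusCode() != 200) {
            throw new IllegalStateException("Blob request failed with HTTP " + response.statusCode());
        }
        return response.body();
    }
}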

Below are some sample Java snippets that parse the output from the reference examples. Each snippet uses the getBlobStoreData(...) helper sketched above to fetch an archived blob by its data handle.

For avro-example,

// Fetch the archived blob and decode it as Avro-serialized ArchivedMessage records.
byte[] data = getBlobStoreData(datahandle);
Set<ArchivedMessage> setOfBlobMessages = new HashSet<>();
try {
    setOfBlobMessages.addAll(AvroHelper.fromStream(new ByteArrayInputStream(data), ArchivedMessage.class));
} catch (Exception e) {
    LOG.error("Unexpected error", e);
}

For parquet-example,

// Fetch the archived blob, write it to a temporary file, and read it back with
// a Parquet reader (Parquet requires a file path rather than a stream).
byte[] data = getBlobStoreData(datahandle);
List<SdiiMessage.Message> sdiiMessages = new ArrayList<>();
try {
    Path tmpDir = Files.createTempDirectory("parquetTmp");
    tmpDir.toFile().deleteOnExit();
    Path parquetTmpFilePath = tmpDir.resolve(UUID.randomUUID().toString());
    Files.write(parquetTmpFilePath, data);
    try (ProtoParquetReader<SdiiMessage.Message.Builder> reader =
            new ProtoParquetReader<>(new org.apache.hadoop.fs.Path(parquetTmpFilePath.toString()))) {
        // read() returns null when the end of the file is reached.
        SdiiMessage.Message.Builder sdiiMessageBuilder;
        while ((sdiiMessageBuilder = reader.read()) != null) {
            sdiiMessages.add(sdiiMessageBuilder.build());
        }
    }
    Files.delete(parquetTmpFilePath);
} catch (Exception e) {
    LOG.error("Unexpected error", e);
}

For protobuf-example,

// Fetch the archived blob and parse it as a protobuf-encoded message list.
byte[] data = getBlobStoreData(datahandle);
List<SdiiMessage.Message> sdiiMessages = new ArrayList<>();
try {
    SdiiMessageList.MessageList messageListProtoBuf = SdiiMessageList.MessageList.parseFrom(data);
    sdiiMessages.addAll(messageListProtoBuf.getMessageList());
} catch (Exception e) {
    LOG.error("Unexpected error", e);
}
