Include the SDK in your project

Applications which run in the HERE Workspace may have dependencies on the following components:

  • HERE Data SDK for Java & Scala libraries, such as the Location Library, the Data Processing Library, or the Data Client Library.

  • Libraries provided as part of a runtime environment. The Workspace supports two runtime environments: batch and stream. The batch environment provides a minimal set of libraries that Apache Spark applications use. The stream environment provides libraries that Apache Flink applications use.

  • Protobuf schemas for encoding and decoding data, like schemas for HERE Map Content, HERE Weather, or those which are user provided.

The HERE Data SDK provides different methods for managing each of these types of dependencies.

HERE Data SDK libraries

The libraries within the Data SDK have different versions. For example, Data Processing Library 3.1.13 and Data Client Library 0.2.24 are part of the same Data SDK release. Also, there are interdependencies within the Data SDK, for example the Location Library and the Data Processing Library depend on the Data Client Library, which you must consider when managing dependencies.

To help with dependency management, the Data SDK provides Bill of Materials (BOM) files. Each BOM is a Maven POM file that lists compatible versions of these libraries.

There are three different BOM files:

  • Scala 2.12:
    • sdk-batch-bom_2.12.pom - This file contains compatible versions of the HERE Data SDK for Java & Scala libraries, their dependencies and the libraries provided by the HERE Workspace Batch 3.0.0 runtime environment (Apache Spark 2.4.7). See SDK Libraries for Batch Processing with Scala 2.12.
    • sdk-stream-bom_2.12.pom - This file contains compatible versions of Data SDK libraries, their dependencies, and the libraries provided by the HERE Workspace Stream 5.0 runtime environment (Apache Flink 1.13.5). See SDK Libraries for Stream Processing with Scala 2.12.
    • sdk-standalone-bom_2.12.pom - This file contains compatible versions of the Data SDK libraries and their dependencies. Use this POM file when running your application in your own environment. See SDK Libraries for usage in Standalone mode with Scala 2.12.

Notice that you need repository credentials to resolve dependencies.
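
For illustration, a minimal sketch of a Maven settings.xml fragment that supplies such credentials might look as follows. The server id and the credential values below are placeholders; use the values generated for your HERE platform account, and make sure the id matches the id of the repository configured in your POM.

<!-- Sketch of a ~/.m2/settings.xml fragment (placeholder values) -->
<settings>
  <servers>
    <server>
      <!-- Must match the id of the repository declared in your POM -->
      <id>HERE_PLATFORM_REPO</id>
      <username>your-access-key-id</username>
      <password>your-access-key-secret</password>
    </server>
  </servers>
</settings>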

Include BOMs

You can include the BOM files in your project POM file in two different ways:

  1. As parent POM file (recommended):

    <!-- Inherit all properties, dependency management, and plugin management from the SDK -->
    <parent>
      <groupId>com.here.platform</groupId>
      <artifactId>sdk-batch-bom_2.12</artifactId>
      <version>2.54.3</version>
    </parent>
    
  2. Or, if your project already has a parent, use import for a dependency:

    <dependencyManagement>
      <dependencies>
        <dependency>
          <!-- Import dependency management from the SDK, but properties and plugin management are skipped -->
          <groupId>com.here.platform</groupId>
          <artifactId>sdk-batch-bom_2.12</artifactId>
          <version>2.54.3</version>
          <type>pom</type>
          <scope>import</scope>
        </dependency>
      </dependencies>
    </dependencyManagement>
    

Note

Examples
All code snippets use sdk-batch-bom_2.12 as an example, but the same instructions are applicable to sdk-stream-bom_2.12 and sdk-standalone-bom_2.12. Be aware that only one SDK BOM file should be used in a project.

We recommend including the BOM file as a parent because it contains a plugin management section and useful properties.

For example, the plugin management section of sdk-batch-bom_2.12 and sdk-stream-bom_2.12 contains the platform profile to create a fat JAR. You need to upload this fat JAR to the Workspace before you run the application in the cloud.

Note

The platform profile excludes some files and shades some libraries to ensure proper operation when using the JAR from within a Pipeline. In particular, the application.properties file is excluded from the platform profile in the sdk-stream-bom_2.12 file to not override the application.properties file provided by the runtime.

Another important part of the plugin management section and platform profile is the maven-shade-plugin, which shades protobuf-java 3+. The shading mechanism is required because of a conflict between the version of the protobuf-java library used in Apache Flink and Apache Spark, and the version of this library used by Data SDK libraries and Protobuf layer schemas. The Data SDK uses the newer version 3+, but both the stream and batch environments provide protobuf 2+.
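
As an illustration, a relocation of this kind configured in the maven-shade-plugin looks roughly like the following sketch. The relocation pattern and shaded prefix are assumptions; the authoritative configuration is the platform profile in the SDK BOM files.

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <!-- Move the protobuf-java 3+ classes to a shaded package so they do not
           clash with the protobuf 2+ classes provided by the runtime environment -->
      <relocation>
        <pattern>com.google.protobuf</pattern>
        <shadedPattern>shaded.com.google.protobuf</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>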

You can activate the platform profile as follows: mvn -Pplatform package or mvn --activate-profiles=platform package.

If you cannot include a BOM file as the parent of your POM, review its contents and copy the properties and plugin management configuration that are relevant to your project.
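
For example, the following is a minimal sketch of properties you might copy into your own POM when you cannot inherit them. The values match the batch BOM described earlier in this document; verify them against the BOM release you actually use.

<properties>
  <!-- Values taken from the batch BOM described above; confirm against your BOM release -->
  <scala.compat.version>2.12</scala.compat.version>
  <spark.version>2.4.7</spark.version>
</properties>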

Include dependencies

As soon as your project includes a BOM file, the project can reference a Data SDK library or dependency without specifying the actual version of the library.

<dependencies>
  <!-- Reference to pipeline runner library from Data Processing Library -->
  <dependency>
      <groupId>com.here.platform.data.processing</groupId>
      <artifactId>pipeline-runner-java_2.12</artifactId>
  </dependency>
</dependencies>

This approach simplifies migration from one Data SDK version to a more recent one. If you change the version of the sdk-batch-bom_2.12 in your project, your project uses updated versions of the following:

  • Data SDK libraries
  • dependencies of all Data SDK libraries
  • libraries provided by the batch environment
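
For example, such a migration typically amounts to bumping the parent version; the target version below is hypothetical:

<parent>
  <groupId>com.here.platform</groupId>
  <artifactId>sdk-batch-bom_2.12</artifactId>
  <!-- Hypothetical newer release; replace with the Data SDK version you are migrating to -->
  <version>2.55.0</version>
</parent>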

When an application depends on a different version of a package than the one defined in the BOM, it may encounter runtime exceptions such as the following:

  • ClassNotFoundException
  • NoSuchMethodError

These exceptions are raised when classes or methods are not found in the library provided in the runtime environment. There are two common cases:

  1. The runtime environment contains an older version of the library, and the application requires a newer one with new methods and classes.
  2. The runtime environment contains a newer version of the library, but the application uses deprecated methods or classes from an older version that have since been removed.

If the library is referenced directly in your project, you can use the corresponding version of the library from the BOM files. However, the version of one of its transitive dependencies may conflict with the version provided in the runtime environment. In this case, you need to shade this transitive dependency. For an example of how to shade a library, see the platform profile in sdk-batch-bom_2.12 or sdk-stream-bom_2.12, which shades the protobuf-java package as mentioned earlier.

For more information about how to import dependencies and BOMs, see the Maven Dependency Management Guide.

Note

Archetypes
The Workspace-provided batch and streaming archetypes contain BOMs and have the necessary dependencies in their project POM files. Only the Maven build system supports these archetypes.

Pipelines runtime environment libraries

The HERE Workspace provides batch and streaming pipelines to run location data-based applications. Each pipeline job is executed within a specific runtime environment. This environment consists of the following:

  1. Java Runtime Environment (JRE)
  2. Apache Spark or Apache Flink framework

A batch (Spark) or streaming (Flink) environment comes with a list of packages that are loaded by the Java Virtual Machine (JVM) and take precedence over any other application-provided package.

To describe the list of packages that are part of the runtime environment, the Data SDK provides two environment BOMs:

  • Scala 2.12:
    • environment-batch-3.0.0.pom
    • environment-stream-5.0.2.pom

Packages for a runtime environment are marked as provided in the corresponding environment BOM. For example, the Apache Spark package spark-core is marked as provided in environment-batch-3.0.0.pom:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_${scala.compat.version}</artifactId>
  <version>${spark.version}</version>
  <scope>provided</scope>
</dependency>

Notice that sdk-batch-bom_2.12 and sdk-stream-bom_2.12 files include the corresponding environment-batch-3.0.0.pom and environment-stream-5.0.2.pom files. A project that uses sdk-batch-bom_2.12 or sdk-stream-bom_2.12 does not need to explicitly include the corresponding environment POM file. For more information about the libraries provided by the pipeline runtime environments, see the corresponding environment POM files.

Protobuf schemas

The HERE Workspace allows users to define their own Protobuf schemas and distribute them in the Workspace. The schemas are stored in a special repository that can be accessed only through the HERE Maven Wagon plugin in Maven projects or the unofficial HERE SBT Resolver plugin in SBT projects.

To start configuring and using the plugins, you need HERE platform credentials.

Maven projects

To configure your project for use with the Maven Wagon plugin, follow the steps below:

  1. Include the latest version of the Maven Wagon plugin in your project. For the version number, see the HERE Maven Wagon plugin page.

     <build>
       <extensions>
         <extension>
           <groupId>com.here.platform.artifact</groupId>
           <artifactId>artifact-wagon</artifactId>
           <version>${artifact.wagon.version}</version>
         </extension>
       </extensions>
     </build>
    
  2. Add the HERE Artifact Service reference into your project. The here+artifact-service://artifact-service placeholder will be replaced by the plugin dynamically based on your credentials.

     <repositories>
       <!-- The reference to the HERE repository with schemas -->
       <repository>
         <id>HERE_PLATFORM_ARTIFACT</id>
         <layout>default</layout>
         <url>here+artifact-service://artifact-service</url>
       </repository>
     </repositories>
    
  3. Add the dependency to the schema in your project, similar to the Administrative Place Profiles example below. Use the values for the schema you need to add.

     <dependencies>
       <dependency>
         <groupId>com.here.schema.rib</groupId>
         <artifactId>administrative-place-profiles_v2_java</artifactId>
         <version>2.88.0</version>
         <type>jar</type>
       </dependency>
     </dependencies>
    

    To find the code snippet for your schema, go to the HERE portal. Find a shared schema you want to include and open the Artifacts tab.

For more information on the configuration of the Maven Wagon plugin, see HERE platform Maven Wagon plugin.

SBT projects

To configure your project for use with the SBT Resolver plugin, follow the steps below:

  1. Register the sbt-resolver plugin by adding the following entry to the project/plugins.sbt file:

     addSbtPlugin("com.here.platform.artifact" % "sbt-resolver" % sbtResolverVersion)
    
  2. Add the HERE Artifact Service reference into your project. The here+artifact-service://artifact-service placeholder will be replaced by the plugin dynamically based on your credentials:

     resolvers += "HERE_PLATFORM_ARTIFACT" at "here+artifact-service://artifact-service"
    
  3. Add the dependency to the schema in your project by including the following declaration in the build.sbt file, similar to the Administrative Place Profiles example below. Use the values for the schema you need to add.

     libraryDependencies += "com.here.schema.rib" %% "administrative-place-profiles_v2_scala" % "2.88.0"
    

    To find the details of the schema, go to the HERE portal. Find a shared schema you want to include and open the Artifacts tab.

For more information on the configuration of the SBT Resolver plugin, see the HERE SBT Resolver plugin page.

For more information on how to create and publish a new schema, see Create and Extend Schemas.

Scala 2.11 to Scala 2.12 migration guide

This section describes high-level changes that have to be made to migrate your app from Scala 2.11 to Scala 2.12. Scala 2.12 support was introduced as part of the HERE Data SDK for Java & Scala 2.28 release.

  1. New SDK BOM files with the _2.12 suffix were added for each environment. To migrate to Scala 2.12, update the SDK BOM artifactId to include the _2.12 suffix, as shown below.
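
    For example, a batch project would declare the Scala 2.12 BOM as its parent as follows; the version shown is the one used earlier in this document, so use the release you actually target:

     <parent>
       <groupId>com.here.platform</groupId>
       <artifactId>sdk-batch-bom_2.12</artifactId>
       <version>2.54.3</version>
     </parent>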

  2. Change your app's dependencies that have the _2.11 suffix to use the _2.12 suffix. It is recommended that you use _${scala.compat.version} as the suffix. For instance:

     <dependency>
       <groupId>org.apache.spark</groupId>
       <artifactId>spark-core_${scala.compat.version}</artifactId>
       <version>${spark.version}</version>
       <scope>provided</scope>
     </dependency>
    

    For convenience, all SDK BOM files declare the ${scala.compat.version} property. To use the property, the SDK BOM must be added as the parent of your project.

  3. Schemas maintained by HERE have been updated to support Scala 2.12:

    • HERE Map Content Schema
    • SDII Schema
    • Sensoris Schema
    • Real-Time Traffic Schema
    • Weather Schema
    • Optimized Map for Location Library

    All Scala 2.12 bindings have the _2.12 suffix. To use the above-mentioned schemas, you have to add _2.12 to the schema artifactId.

    For more details on how to include a schema into the Maven or sbt project, refer to the schema page on the platform, see Sensoris schema as an example.
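
    For example, based on the Maven snippet shown earlier for Administrative Place Profiles, a Scala 2.12 binding would be referenced roughly as follows. The exact artifactId is an assumption; check the schema's Artifacts tab for the definitive coordinates.

     <dependency>
       <groupId>com.here.schema.rib</groupId>
       <!-- Assumed artifactId: the Scala binding with the _2.12 suffix appended -->
       <artifactId>administrative-place-profiles_v2_scala_2.12</artifactId>
       <version>2.88.0</version>
       <type>jar</type>
     </dependency>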

  4. Your own schemas published on the platform must be updated to support Scala 2.12 by adding Scala 2.12 bindings to the schema project.

    For more information on migrating schemas to use Scala 2.12, refer to the Schema Migration Guide.

  5. Since the 2.28 release, the HERE platform supports two additional pipeline environments for Scala 2.12. To run Scala 2.12 applications on the platform, use the newest environment versions, that is, the Batch 3.0 and Stream 5.0 environments described earlier in this document.

    If you use the OLP CLI, you must change the <cluster type> parameter in the olp pipeline template create command.

Known issues

  1. If your app uses maven-shade-plugin version 3.1.0 or older, upgrade it to a newer version. The recommended way to upgrade is to use the version from the SDK BOM.
  2. Spark's Dataset API can be ambiguous when you use Java 8 lambdas together with Scala 2.12. In this case, add an explicit cast to MapFunction<> in your Java 8 code that uses the Spark Dataset .map() method.
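
    For example, a cast along these lines resolves the issue. This is a sketch that assumes dataset is a Dataset<String>:

     import org.apache.spark.api.java.function.MapFunction;
     import org.apache.spark.sql.Dataset;
     import org.apache.spark.sql.Encoders;

     // The explicit MapFunction<> cast disambiguates the overloaded .map() call
     // when the application is built against Spark for Scala 2.12.
     Dataset<String> upperCased = dataset.map(
         (MapFunction<String, String>) value -> value.toUpperCase(),
         Encoders.STRING());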
