User Guide 101 - Tool/Service/Data Providers

*** Note: This guide is still a work in progress. ***

Please refrain from making any significant design/implementation decisions based on the current version of this guide.

Target audience

  • This guide is intended to provide support to the Tool/Service/Data Providers, i.e., the users who want to integrate/provide their tool/service/data that does not have an associated Identity Provider (IdP) as part of the EFPF ecosystem.

  • The Tool/Service/Data Providers can be of different (sub-) types (sub-user-roles):

    1. Providers of SaaS (Software as a Service): Providers of Tools/Services centrally hosted on the cloud, e.g., ROAM Tool, Data Spine NiFi, Data Spine RabbitMQ, EFS, Gmail, GitHub, Skype, Docker Hub, Microsoft Office 365, etc., which generally follow, or intend to follow, a subscription-based or access-based (pay-as-you-go) pricing/licensing model. This includes the tools/services that make processed / value-added data available to other users in the EFPF ecosystem, e.g., by publishing it to the Data Spine Message Bus (RabbitMQ), or by making it available over an HTTP/REST API.

    2. Providers of SaaP (Software as a Product): Providers of Tools that are sold or given to users as products and that therefore generally follow, or intend to follow, a one-time pricing/licensing model. E.g., Factory Connectors or IoT Gateways such as TSMatch, Symphony HAL, Industreweb Collect; the Google Chrome browser; Microsoft Office 2010, etc.

    3. Data Providers/Publishers: Providers of data through synchronous (request-response) or asynchronous (Pub/Sub) APIs. E.g., the OpenWeatherMap HTTP API, EFPF Open Datasets, the MQTT API provided by the Industreweb Collect Factory Connector installed at Lagrama’s factory, etc. In general, in the context of the EFPF ecosystem, Data Providers/Publishers are the users who install the EFPF SaaP tools/services (such as Factory Connectors or IoT Gateways) at factory premises to collect sensor data from the shop floor and make it available to the other users in the EFPF ecosystem, e.g., by publishing it to the Data Spine RabbitMQ, or by making it available over an HTTP/REST API.

  • If you want to integrate a platform or one or more tools/services that already have an associated Identity Provider, then refer to the User Guide 101 for Platform Providers instead.

EFPF Ecosystem Deployment Diagram

(Figure: EFPF Ecosystem Deployment)

  • As illustrated in the deployment diagram, the core services called ‘Ecosystem Enablers’ are hosted centrally on the EFPF servers.
  • The rest of the tools, services and platforms are self-hosted by the respective providers on their servers, and they connect to the EFPF ecosystem through the Ecosystem Enablers.

Steps

Prerequisite: Setting up your EFPF User Account

  • To perform/test out many of the following integration steps, an EFPF user account is necessary.
  • You can find the instructions for setting up your EFPF user account here.

The steps for the provision and integration of a tool/service depend on the sub-user-role:

1. SaaS Providers

  1. Deployment
  • Deploy your tool/service on the servers managed by you
  2. Single sign-on (SSO) - Authentication and authorization for request-response APIs (if applicable)
  • Your tool’s interfaces (GUI/APIs) must be accessible using EFPF user accounts
  • If your tool offers a GUI, add functionality that redirects the user to the EFS login page for authentication (if the user isn’t logged in already)
  • This will also require registration of a new client for your tool in the EFS Keycloak. To do that, either use the Client Registration API of Keycloak or send an email to the EFPF Support Team.
  • Configure your tool’s GUI to use the OAuth2.0 Authorization Code Grant Type/Flow to request an Access Token from the EFS Keycloak (see the Authorization Code flow sketch at the end of this step)
  • Use this access/bearer token to access the tool’s API
  • If your tool offers only an API and no GUI, you can assume that the user who wants to access the API is already in possession of an access token
  • Add functionality to your tool to perform authentication for incoming API access requests (see the token-verification sketch at the end of this step). This can be done in various ways:
    1. using a proxy route/endpoint in a local instance of Apache APISIX,
    2. using a Policy Enforcement Point (PEP) embedded into the tool,
    3. using an external library integrated into the tool that takes care of authentication,
    4. using a locally deployed proxy microservice that performs authentication (e.g., oauth2-proxy), etc.
  • EFS Keycloak’s public key can be retrieved from https://<Prod environment EFS Keycloak URL>/auth/realms/<realm name> (realm name would be either ‘efpf’ or ‘master’). Prod environment EFS URL can be obtained from the Connection Details page.
  • Add functionality to your tool to perform authorization for incoming API access requests
  • Details: EFS Documentation
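
A minimal sketch of the Authorization Code flow described above is shown below, assuming a server-side GUI and using only the requests library. The Keycloak base URL, realm, client ID/secret and redirect URI are placeholders for the values you receive when your client is registered in the EFS Keycloak; the standard Keycloak OpenID Connect endpoints under /auth/realms/<realm>/protocol/openid-connect/ are assumed.

```python
# Minimal sketch of the OAuth2.0 Authorization Code flow against the EFS Keycloak.
# All values below (base URL, realm, client credentials, redirect URI) are placeholders.
import secrets
from urllib.parse import urlencode

import requests

KEYCLOAK_BASE = "https://<Prod environment EFS Keycloak URL>"  # see Connection Details page
REALM = "efpf"                                                 # 'efpf' or 'master'
CLIENT_ID = "my-tool"                                          # issued at client registration
CLIENT_SECRET = "..."                                          # issued at client registration
REDIRECT_URI = "https://my-tool.example.com/callback"

AUTH_ENDPOINT = f"{KEYCLOAK_BASE}/auth/realms/{REALM}/protocol/openid-connect/auth"
TOKEN_ENDPOINT = f"{KEYCLOAK_BASE}/auth/realms/{REALM}/protocol/openid-connect/token"

def build_login_redirect() -> str:
    """URL to redirect a not-yet-authenticated user to the EFS login page."""
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "openid",
        "state": secrets.token_urlsafe(16),  # store and verify on the callback
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

def exchange_code_for_token(code: str) -> dict:
    """Called in the /callback handler: swap the authorization code for tokens."""
    resp = requests.post(TOKEN_ENDPOINT, data={
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    }, timeout=10)
    resp.raise_for_status()
    return resp.json()  # contains 'access_token', 'refresh_token', etc.
```

The returned access_token is the bearer token that the GUI sends in the Authorization header of subsequent calls to the tool’s API.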
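
Similarly, the following is a minimal sketch of in-tool authentication of incoming API requests (options 2–3 above): the realm public key is fetched from the URL given above and incoming bearer tokens are verified with the PyJWT library. The base URL, realm and audience handling are placeholders/assumptions that need to be adapted to your setup.

```python
# Minimal sketch: verify incoming bearer tokens against the EFS Keycloak realm public key.
# Placeholders: KEYCLOAK_BASE and REALM must be taken from the Connection Details page.
import jwt        # PyJWT (with the 'cryptography' extra installed)
import requests

KEYCLOAK_BASE = "https://<Prod environment EFS Keycloak URL>"
REALM = "efpf"    # 'efpf' or 'master'

def fetch_realm_public_key() -> str:
    """Retrieve the realm's RSA public key and wrap it as a PEM string."""
    realm_info = requests.get(f"{KEYCLOAK_BASE}/auth/realms/{REALM}", timeout=10).json()
    return (
        "-----BEGIN PUBLIC KEY-----\n"
        + realm_info["public_key"]
        + "\n-----END PUBLIC KEY-----"
    )

PUBLIC_KEY_PEM = fetch_realm_public_key()  # cache this; re-fetch only on key rotation

def authenticate_request(authorization_header: str) -> dict:
    """Validate the 'Authorization: Bearer <token>' header of an incoming API request.

    Returns the decoded claims on success, raises jwt.InvalidTokenError otherwise.
    The claims (e.g., roles) can then feed the authorization decision.
    """
    token = authorization_header.removeprefix("Bearer ").strip()  # Python 3.9+
    return jwt.decode(
        token,
        PUBLIC_KEY_PEM,
        algorithms=["RS256"],
        options={"verify_aud": False},  # tighten to your client's audience in production
    )
```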
  3. Authentication and authorization for Pub/Sub APIs (if applicable)
  • Use the Pub/Sub Security Service dashboard from the EFPF Portal to get credentials for the Data Spine Message Bus (DS RabbitMQ) and to get permissions to publish/subscribe to topics/queues in DS RabbitMQ
  • Configure your tool to publish/subscribe to DS RabbitMQ (see the publishing sketch at the end of this step)
  • The users who want to subscribe to your topics can make use of the Pub/Sub Security Service dashboard to ask for access, and you can see and approve/reject the access requests using the dashboard
  • Details: Pub/Sub Security Service documentation – link to be added
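
For illustration, a minimal sketch of publishing a message to DS RabbitMQ over AMQPS with the pika library is shown below. The host, port, virtual host, exchange and routing key are placeholders; the actual values and credentials are the ones issued via the Pub/Sub Security Service dashboard / Connection Details page.

```python
# Minimal sketch: publish a message to the Data Spine Message Bus (DS RabbitMQ) over AMQPS.
# All connection values are placeholders; use the ones issued via the Pub/Sub Security Service.
import json
import ssl

import pika

credentials = pika.PlainCredentials("<username>", "<password>")
parameters = pika.ConnectionParameters(
    host="<DS RabbitMQ host>",
    port=5671,                                   # AMQPS; adjust if the deployment differs
    virtual_host="<vhost>",
    credentials=credentials,
    ssl_options=pika.SSLOptions(ssl.create_default_context()),
)

with pika.BlockingConnection(parameters) as connection:
    channel = connection.channel()
    message = {"sensorId": "press-01", "value": 42.7, "timestamp": "2023-01-01T12:00:00Z"}
    channel.basic_publish(
        exchange="amq.topic",                       # placeholder; use the exchange you were granted
        routing_key="<company>.<factory>.<topic>",  # a topic you have publish permission for
        body=json.dumps(message),
    )
```

Subscribing works analogously (channel.basic_consume on a queue bound to the topics you have been granted access to).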
  4. Service/API registration
  5. Data enrichment (if needed)
  • If you want to offer enriched data, or data conforming to some other data model (e.g., making OGC SensorThings compliant data available over another API in addition to the proprietary data model served by your tool/service’s API), to the potential service/data consumers, you can use the Data Spine Integration Flow Engine (DS NiFi); an illustrative data-model mapping sketch follows at the end of this step
  • Details: User Guide 101 for Composite Application Developer
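
To make the enrichment step more concrete, the sketch below shows the kind of mapping such an integration flow performs: a reading in a hypothetical, invented proprietary format is converted into an OGC SensorThings Observation. In practice this logic would typically live inside a DS NiFi integration flow (e.g., in a scripted processor or a record/JOLT transform); the snippet only illustrates the data-model mapping itself.

```python
# Illustrative only: map a hypothetical proprietary sensor reading to an
# OGC SensorThings API 'Observation' entity. The input field names are invented.
def to_sensorthings_observation(reading: dict, datastream_id: int) -> dict:
    return {
        "phenomenonTime": reading["measuredAt"],            # ISO 8601 timestamp
        "result": reading["value"],
        "resultTime": reading.get("reportedAt", reading["measuredAt"]),
        "Datastream": {"@iot.id": datastream_id},           # links the Observation to its Datastream
    }

# Example usage with a made-up proprietary payload:
proprietary = {"sensor": "temp-01", "value": 21.5, "measuredAt": "2023-01-01T12:00:00Z"}
observation = to_sensorthings_observation(proprietary, datastream_id=42)
```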

2. SaaP Providers

  1. Tool Provision
  • Make the tool artifacts, such as binaries or source code, downloadable by authenticated and authorized users, or publicly available (e.g., if the tool is free and/or open source)
  • Publish the documentation for the tool such as admin, developer and user guides, including the API specifications
  2. Advertise

3. Data Providers/Publishers

  1. Search for a tool (if needed)
  • Search for a tool (e.g., a Factory Connector / an IoT Gateway) on the EFPF Portal/Marketplace
  2. Get the tool (if needed)
  • Purchase/download the tool using the link from the EFPF Portal/Marketplace
  3. Deployment
  • Deploy your tool on the servers managed by you (e.g., on factory premises)
  • You can follow the admin guide of the tool from the EFPF Documentation Portal for deployment, initial setup and administration
  4. Collect data
  • Connect your tool to data sources in order to collect data, e.g., connect to sensors to collect shop floor data (see the collection sketch below)
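
As an illustration, the sketch below collects readings from a local MQTT-capable data source (e.g., the MQTT API of a Factory Connector / IoT Gateway on the shop floor) using the paho-mqtt library. The broker address and topic are placeholders, and the paho-mqtt 1.x callback API is assumed.

```python
# Minimal sketch: collect shop-floor data from a local MQTT-capable gateway / Factory Connector.
# Broker address and topic are placeholders; the paho-mqtt 1.x callback API is assumed.
import json

import paho.mqtt.client as mqtt

LOCAL_BROKER = "192.168.1.50"           # placeholder: the gateway on the factory network
SENSOR_TOPIC = "factory1/line2/+/data"  # placeholder: topics exposed by the gateway

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    # ... buffer/validate the reading here, then forward it
    #     (e.g., publish it to DS RabbitMQ as described in the Pub/Sub step below)
    print(msg.topic, reading)

client = mqtt.Client()                  # paho-mqtt 1.x style constructor
client.on_message = on_message
client.connect(LOCAL_BROKER, 1883)
client.subscribe(SENSOR_TOPIC)
client.loop_forever()
```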
  5. Authentication and authorization for request-response APIs (if applicable)
  • If your tool makes the data available over an HTTP API, make the API accessible using EFPF user accounts
  • Add functionality to your tool to perform authentication for incoming API access requests (the token-verification sketch under the SaaS Providers section above applies here as well). This can be done in various ways:
    1. using a proxy route/endpoint in a local instance of Apache APISIX (see the sketch at the end of this step),
    2. using a Policy Enforcement Point (PEP) embedded into the tool,
    3. using an external library integrated into the tool that takes care of authentication,
    4. using a locally deployed proxy microservice that performs authentication (e.g., oauth2-proxy), etc.
  • EFS Keycloak’s public key can be retrieved from https://<Prod environment EFS Keycloak URL>/auth/realms/<realm name> (the realm name would be either ‘efpf’ or ‘master’). Prod environment EFS URL can be obtained from the Connection Details page.
  • Add functionality to your tool to perform authorization for incoming API access requests
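
As an illustration of option 1 above, the sketch below registers a proxy route in a local Apache APISIX instance via its Admin API, with the openid-connect plugin pointed at the EFS Keycloak so that APISIX validates bearer tokens before forwarding requests to the tool. The Admin API address and key, the route, the upstream and the Keycloak/client values are placeholders, and the exact plugin options should be verified against the APISIX version you deploy.

```python
# Illustrative sketch: create an APISIX route that validates EFS Keycloak bearer tokens
# (openid-connect plugin in bearer-only mode) before proxying to the local tool API.
# Admin address/key, upstream, realm and client values are placeholders; verify the
# plugin option names against your APISIX version's documentation.
import requests

APISIX_ADMIN = "http://127.0.0.1:9180/apisix/admin"  # Admin API port differs per APISIX version
ADMIN_KEY = "<apisix admin key>"

route = {
    "uri": "/mytool/api/*",
    "plugins": {
        "openid-connect": {
            "discovery": "https://<Prod environment EFS Keycloak URL>/auth/realms/efpf"
                         "/.well-known/openid-configuration",
            "client_id": "<client id>",
            "client_secret": "<client secret>",
            "bearer_only": True,            # only accept requests that already carry a token
        }
    },
    "upstream": {
        "type": "roundrobin",
        "nodes": {"127.0.0.1:8080": 1},     # the locally deployed tool API
    },
}

resp = requests.put(
    f"{APISIX_ADMIN}/routes/1",
    json=route,
    headers={"X-API-KEY": ADMIN_KEY},
    timeout=10,
)
resp.raise_for_status()
```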
  6. Authentication and authorization for Pub/Sub APIs (if applicable)
  • Use the Pub/Sub Security Service dashboard from the EFPF Portal to get credentials for the Data Spine Message Bus (DS RabbitMQ) and to get permissions to publish/subscribe to topics/queues in DS RabbitMQ
  • Configure your tool to publish/subscribe to DS RabbitMQ (the publishing sketch under the SaaS Providers section above applies here as well)
  • The users who want to subscribe to your topics can make use of the Pub/Sub Security Service dashboard to ask for access, and you can see and approve/reject the access requests using the dashboard
  • Details: Pub/Sub Security Service documentation – link to be added
  7. Service/API registration
  8. Data enrichment (if needed)
  • If you want to offer enriched data, or data conforming to some other data model (e.g., making OGC SensorThings compliant data available over another API in addition to the proprietary data model served by your tool/service’s API), to the potential service/data consumers, you can use the Data Spine Integration Flow Engine (DS NiFi); the data-model mapping sketch under the SaaS Providers section above applies here as well
  • Details: User Guide 101 for Composite Application Developer

EFPF Ecosystem Documentation