Configuring the Airlock Secure Access Hub reporting bundle

Airlock Gateway and Airlock IAM can be configured to send their logs directly to an external Elasticsearch and Kibana (EK) stack. This article describes how to move reporting off the appliance by using the external reporting bundle to deploy and/or configure an EK stack for Airlock Gateway logs and/or Airlock IAM logs. On the appliance itself, Airlock Gateway provides a local EK-based reporting system that runs on the same host as the web application firewall. Airlock IAM can expose logging and reporting information through log files, by appending log messages to stdout, or by forwarding log messages to an external EK instance.

The Airlock Secure Access Hub reporting bundle configures an external EK instance so that logs can be stored and visualized. We recommend the reporting bundle for the following scenarios:

  • Reporting should be centralized to merge logs from multiple instances of Airlock Gateway.
  • Reporting should be centralized to correlate logs from Airlock Gateway and Airlock IAM.
  • Logs from different sources should be integrated into an existing EK infrastructure.
  • Reporting must be offloaded to dedicated hosts to improve availability, performance, and disk-space scalability.

What the reporting bundle provides

The reporting bundle preconfigures EK instances with the following content:

  • Elasticsearch content: index templates/mappings and ingest pipelines
  • Kibana content: saved objects (index patterns, searches, visualizations, dashboards)

All other aspects of configuring the EK stack (e.g. high availability, authentication) are outside the scope of the reporting bundle.

Configuration

Version compatibility

Use the bundle version that matches your Elasticsearch version. Mismatched versions are not supported. For details about version dependencies, refer to the relevant download page on Techzone.
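A quick way to determine the running Elasticsearch version is to read the JSON info document that the server returns on its root URL (for example with curl http://localhost:9200) and compare the version number against the bundle version from Techzone. The sketch below parses a sample response with POSIX tools only; the response content and the version number 7.17.9 are illustrative, not prescriptive.

```shell
# Sample JSON info document as returned by: curl http://localhost:9200
# (node name, cluster name, and version number below are illustrative)
response='{
  "name" : "node-1",
  "cluster_name" : "elasticsearch",
  "version" : { "number" : "7.17.9" }
}'

# Extract version.number and compare it against the downloaded bundle version
es_version=$(printf '%s' "$response" | grep -o '"number" *: *"[^"]*"' \
  | head -n 1 | sed 's/.*"\([^"]*\)"$/\1/')
echo "Elasticsearch version: $es_version"
```

In a real deployment, replace the hard-coded sample with the live response, e.g. `response=$(curl -s http://localhost:9200)`.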

Configuring Airlock Gateway to use external Elasticsearch

In the Configuration Center, go to:
Log & Report >> Settings

For details, refer to the relevant subsection of the Configuration Center section.

Configuring an existing EK deployment

Prerequisites: a Unix shell and cURL.

  1. Download and unzip the reporting bundle from the relevant download page on Techzone.
  2. Change into the directory of the unzipped reporting bundle.
  3. Optional: By default, the scripts set up all the necessary components for both Airlock Gateway and Airlock IAM. To skip the IAM-specific content, uncomment the #export NO_IAM=1 line.
  4. Apply the Elasticsearch configuration by running the following command:

     Terminal box
     ./setup-ek/elasticsearch/setup-elasticsearch.sh http://elasticsearch:9200
  5. Apply the Kibana configuration by running the following command:

     Terminal box
     ./setup-ek/kibana/setup-kibana.sh http://kibana:5601
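After both scripts have run, the installed content can be inspected through the stacks' standard REST APIs (ingest pipelines and index templates on Elasticsearch, saved objects on Kibana). The sketch below only prints the requests as a dry run; the hostnames match the placeholders used in the commands above and must be adjusted to your deployment.

```shell
# Standard read-only endpoints that expose the content the bundle installs:
#   _ingest/pipeline                       -> ingest pipelines (Elasticsearch)
#   _index_template                        -> index templates  (Elasticsearch)
#   api/saved_objects/_find?type=dashboard -> dashboards       (Kibana)
endpoints="http://elasticsearch:9200/_ingest/pipeline
http://elasticsearch:9200/_index_template
http://kibana:5601/api/saved_objects/_find?type=dashboard"

# Printed as a dry run; remove the echo (or pipe the output to sh) to execute
echo "$endpoints" | while read -r url; do
  echo "curl -s $url"
done
```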

Migrating data from the internal to an external Elasticsearch

 
Risk

Elastic generally recommends connecting clusters (e.g., cross-cluster search/replication) instead of copying data.

  • Do not interconnect Airlock Gateway’s internal Elasticsearch with an external cluster. For security reasons this approach is not supported and must be avoided.
  • Use the Airlock Gateway CLI tools to export data from the internal Elasticsearch and import it into the external Elasticsearch.
To migrate the data, proceed as follows:

  1. To allow Airlock Gateway to reach the external Elasticsearch instance, configure Remote Reporting as described in the relevant documentation section.
  2. In the Airlock Gateway admin menu, export the internally stored data with the airlock-elasticsearch-query command, using the internal Elasticsearch instance as the data source.
  3. Import the exported data into the external Elasticsearch instance with the airlock-elasticsearch-import command, using the external Elasticsearch instance configured in step 1 as the data destination.

Note: Sequential dump/restore operations in Elasticsearch are comparatively slow. The overall duration depends on data size, hardware, and network bandwidth. Plan accordingly and monitor progress; splitting the export/import into smaller batches can help manage long-running jobs.
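The batching suggestion above can be sketched as a loop over fixed time windows, with one export/import pair run per window. The date range and the daily granularity below are illustrative assumptions, GNU date is assumed, and the actual airlock-elasticsearch-query/airlock-elasticsearch-import options are deliberately not shown; consult the CLI help for those.

```shell
# Generate one-day windows between start and end (GNU date assumed);
# each window would bound one export/import pair
start="2024-01-01"
end="2024-01-04"
day="$start"
while [ "$day" != "$end" ]; do
  next=$(date -u -d "$day + 1 day" +%F)
  # One airlock-elasticsearch-query export and one
  # airlock-elasticsearch-import would run per window [$day, $next)
  echo "batch: $day..$next"
  day="$next"
done
```

Smaller windows make individual failures cheaper to retry and let you monitor progress per batch, at the cost of more invocations overall.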

Deploying an EK test setup

 
Notice

The following solution is for testing only.

  • Production deployments must ensure high availability of the reporting service and implement log data archiving (e.g., audit logs).
  1. Download and unzip the reporting bundle from the relevant download page on Techzone.
  2. Change into the directory of the unzipped reporting bundle.
  3. Run the following command:

     Terminal box
     docker-compose up --build
  4. Docker pulls and starts Elasticsearch and Kibana; the setup then loads the Elasticsearch content (index templates, ingest pipelines) and imports the Kibana saved objects (searches, visualizations, dashboards) for Airlock Gateway and/or Airlock IAM.

The following table summarizes the default hosts, ports, URLs, and expected responses for the test setup; refer to the vendor documentation for further details:

Property   | Elasticsearch                              | Kibana
host       | localhost or any host of your choosing     | localhost or any host of your choosing
port       | 9200                                       | 5601
URL        | http://localhost:9200                      | http://localhost:5601
response   | a JSON info document about the deployment  | the Kibana web UI starts

Further information