ELK Analytics for WSO2 API Manager

Chameera Rupasinghe
6 min read · Jan 29, 2023


Sources: https://twitter.com/wso2apimanager https://www.elastic.co/what-is/elk-stack

The ELK (Elasticsearch, Logstash, Kibana) stack is widely used for visualizing logs. It comprises a search engine (Elasticsearch), a log ingestion tool (Logstash), and a visualization tool (Kibana).

Read more about ELK at www.elastic.co/what-is/elk-stack

WSO2 API Manager is a platform for building, integrating, and exposing digital services as managed APIs in the cloud, on-premises, or in hybrid architectures.

Read more about WSO2 API Manager at wso2.com/api-manager/

WSO2 API Manager provides analytics support for the APIs it manages. Let's see how to connect the ELK stack to visualize analytics logs from WSO2 API Manager. For simplicity, I'm going to set up all the components on a single machine.

Prerequisites

We are going to need the following.

  1. WSO2 API Manager (4.1.0 or later)
  2. Elasticsearch
  3. Filebeat
  4. Logstash
  5. Kibana

Also, to build and run WSO2 API Manager, we are going to need Java (JDK 11) and Maven (3.6.3 or later).
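
You can quickly verify your environment from a terminal:

% java -version
% mvn -version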

Let’s get into the action.

Installing and configuring WSO2 API Manager

Installing WSO2 API Manager

I’m going to assume that you have already downloaded and installed WSO2 API Manager version 4.1.0 on your computer. Let’s call the installed WSO2 API Manager directory <APIM_HOME>.

You can visit here to find out more about installing WSO2 API Manager.

Configuring WSO2 API Manager

With the wso2-apim directory extracted from the wso2-apim.zip file (our <APIM_HOME>), we now have to make a few configuration changes.

Open the deployment.toml file in the <APIM_HOME>/repository/conf/ directory. Find the apim.analytics section and update it as follows.

[apim.analytics]
enable = true
type = "elk"

Next, open the log4j2.properties file in the <APIM_HOME>/repository/conf/ directory. Find the appenders section in it and add APIM_METRICS_APPENDER to the list.

appenders = APIM_METRICS_APPENDER, .... (list of other available appenders)

The log4j2.properties file will look like this after adding the APIM_METRICS_APPENDER.

Now, in the same file, after the existing appenders, add the lines below.

appender.APIM_METRICS_APPENDER.type = RollingFile
appender.APIM_METRICS_APPENDER.name = APIM_METRICS_APPENDER
appender.APIM_METRICS_APPENDER.fileName = ${sys:carbon.home}/repository/logs/apim_metrics.log
appender.APIM_METRICS_APPENDER.filePattern = ${sys:carbon.home}/repository/logs/apim_metrics-%d{MM-dd-yyyy}-%i.log
appender.APIM_METRICS_APPENDER.layout.type = PatternLayout
appender.APIM_METRICS_APPENDER.layout.pattern = %d{HH:mm:ss,SSS} [%X{ip}-%X{host}] [%t] %5p %c{1} %m%n
appender.APIM_METRICS_APPENDER.policies.type = Policies
appender.APIM_METRICS_APPENDER.policies.time.type = TimeBasedTriggeringPolicy
appender.APIM_METRICS_APPENDER.policies.time.interval = 1
appender.APIM_METRICS_APPENDER.policies.time.modulate = true
appender.APIM_METRICS_APPENDER.policies.size.type = SizeBasedTriggeringPolicy
appender.APIM_METRICS_APPENDER.policies.size.size = 1000MB
appender.APIM_METRICS_APPENDER.strategy.type = DefaultRolloverStrategy
appender.APIM_METRICS_APPENDER.strategy.max = 10

Then add reporter to the loggers list in the same file.

loggers = reporter, ...(list of other available loggers)

Finally, to complete configuring WSO2 API Manager, add the following lines after the existing loggers in the same file.

logger.reporter.name = org.wso2.am.analytics.publisher.reporter.elk
logger.reporter.level = INFO
logger.reporter.additivity = false
logger.reporter.appenderRef.APIM_METRICS_APPENDER.ref = APIM_METRICS_APPENDER
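
Once analytics events start flowing, each event is appended to <APIM_HOME>/repository/logs/apim_metrics.log as a single line. Purely as an illustration (a sketch: the thread, logger name, and property values below are made up), a line looks roughly like this:

10:15:42,231 [-] [PassThroughMessageProcessor-1]  INFO ELKCounterMetric apimMetrics: apim:response, properties :{"apiName":"PizzaShackAPI","apiContext":"/pizzashack/1.0.0","proxyResponseCode":200, ...}

The "apimMetrics: apim:response, ... properties :{...}" shape is exactly what the Logstash grok filter we configure later relies on.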

Now the configuration on the WSO2 API Manager side is done. Let’s install and configure the ELK stack.

Installing and configuring the ELK stack

Here we are going to install and configure Elasticsearch, Filebeat, Logstash, and Kibana.

Installing Elasticsearch

  • Install Elasticsearch according to your operating system as mentioned here.
  • To start the Elasticsearch server, execute the command below from the Elasticsearch directory in a terminal.
% ./bin/elasticsearch
  • When the Elasticsearch server starts for the very first time, the default security settings (including the password for the elastic user) will be displayed. Note them down for later use.
Elasticsearch security settings shown in the first server startup
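
Optionally, you can verify that Elasticsearch is up by calling its root endpoint with the generated elastic password (the -k flag skips TLS verification, since the default certificate is self-signed):

% curl -k -u elastic:PASSWORD https://localhost:9200

A JSON response containing the cluster name and version confirms the server is running.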

Installing and configuring Kibana

  • Visit here to install Kibana according to your operating system.
  • Navigate to the Kibana installation directory and execute the command below in the terminal to start the Kibana server.
% ./bin/kibana
  • Log in to the Kibana dashboard using the username and password shown during the first Elasticsearch server start. The default local address for Kibana is localhost:5601.
  • Navigate to Stack Management > Index Management. If you already see the indices below, delete them, and then import the saved objects as described next.
apim_event*
apim_event_faulty
apim_event_response
  • Download the index artifact from here. The export.ndjson file will be downloaded.
  • Navigate to Stack Management > Saved Objects and click Import. Select the downloaded export.ndjson file.
Import the artifact.
  • Now the setup for Kibana is done.

Installing and configuring Filebeat

  • Install Filebeat according to your operating system as mentioned here.
  • Now we have to configure Filebeat to read the log file created by WSO2 API Manager in the <APIM_HOME>/repository/logs directory.
  • To configure it, navigate to the directory where you installed Filebeat and open the filebeat.yml file.
  • Under the filebeat.inputs section, add the following lines.
- type: log
  enabled: true
  paths:
    - <APIM_HOME>/repository/logs/apim_metrics.log
  include_lines: ['(apimMetrics):']

Replace the <APIM_HOME> part with the actual path to the WSO2 API Manager directory.

  • Under the outputs section add the following lines. Make sure to remove any other outputs if there are any.
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]
  • Use the command below to start Filebeat using the terminal.
% sudo ./filebeat -e --strict.perms=false
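
Before moving on, you can have Filebeat validate its own configuration, and, once Logstash is up, its connection to the configured output:

% ./filebeat test config
% ./filebeat test output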

Installing and configuring Logstash

  • Visit here to install Logstash according to your operating system.
  • Navigate to the installed Logstash directory and edit the logstash-sample.conf file inside the config directory to match the content below.
input {
  beats {
    port => 5044
  }
}

filter {
  grok { match => ["message", "%{GREEDYDATA:UNWANTED}\ apimMetrics:%{GREEDYDATA:apimMetrics}\, %{GREEDYDATA:UNWANTED} \:%{GREEDYDATA:properties}"] }
  json { source => "properties" }
}

output {
  if [apimMetrics] == " apim:response" {
    elasticsearch {
      hosts => ["https://localhost:9200"]
      index => "apim_event_response"
      user => "elastic"
      password => "PASSWORD"
      ssl_certificate_verification => false
    }
  } else if [apimMetrics] == " apim:faulty" {
    elasticsearch {
      hosts => ["https://localhost:9200"]
      index => "apim_event_faulty"
      user => "elastic"
      password => "PASSWORD"
      ssl_certificate_verification => false
    }
  }
}

Add the password from the Elasticsearch security settings in place of PASSWORD in the configuration above.
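
Before starting the server, you can optionally have Logstash check the pipeline file for syntax errors and exit:

% ./bin/logstash -f config/logstash-sample.conf --config.test_and_exit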

  • To start the Logstash server, execute the following command in the terminal.
% ./bin/logstash -f config/logstash-sample.conf

Installation and configuration of all the ELK components is now done. Make sure every component is up and running. Next, let's see how to test the setup by invoking an API through WSO2 API Manager.

Let’s Test the Setup

To start WSO2 API Manager, navigate to the <APIM_HOME>/bin directory and execute the command below in the terminal.

% sh api-manager.sh

Log in to the Publisher portal at https://localhost:9443/publisher/. Use the default username and password (admin/admin) to log in.

Click REST API, and then click DEPLOY SAMPLE API to create and deploy a sample API.

Create a sample API

Now click on the newly created API and navigate to the Try Out section.

Go to the Try Out section on the API

From there, invoke the API using the /menu resource: click Try Out, and then click Execute to send a request.
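
Alternatively, you can invoke the API from the terminal. The sketch below assumes the deployed sample is the default PizzaShack API (context /pizzashack/1.0.0) behind the default gateway port 8243, and that you have generated an access token, for example through the Developer Portal:

% curl -k -H "Authorization: Bearer $ACCESS_TOKEN" https://localhost:8243/pizzashack/1.0.0/menu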

To view the logs generated by this request, go to Kibana and navigate to Discover. You’ll see your request details recorded there.

Logs recorded in Kibana

You can try different filters to narrow down the logs and use different dashboards for better visualization.
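
For example, assuming the indexed events expose fields such as apiName and proxyResponseCode (field names here are assumptions based on the event schema, so check the field list in Discover for the actual ones), a KQL filter could look like this:

apiName : "PizzaShackAPI" and proxyResponseCode >= 400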

Using dashboards
