Connect this data source on your own, using the Hunters platform.
TL;DR
Supported data types | 3rd party detection | Hunters detection | IOC search | Search | Table name | Log format | Collection method |
---|---|---|---|---|---|---|---|
Mulesoft Application Logs | | | | | mulesoft_application_logs | Syslog | S3 |
Overview
MuleSoft, a Salesforce company, is a leading integration platform that enables businesses to connect applications, data, and devices across cloud and on-premises environments. Its Anypoint Platform provides API management, automation, and connectivity solutions, allowing organizations to streamline workflows and enhance digital transformation. With reusable APIs, MuleSoft simplifies integration, improving scalability and efficiency while enabling seamless data exchange across various systems.
Supported data types
Mulesoft Application Logs
Table name: mulesoft_application_logs
MuleSoft Application Logs are a critical aspect of monitoring, troubleshooting, and ensuring the optimal performance of applications and integrations built on the MuleSoft platform. These logs capture a wide range of information, including details about message transactions, errors and exceptions encountered during processing, performance metrics, and system events related to the application lifecycle.
Send data to Hunters
Hunters supports the ingestion of MuleSoft logs via an intermediary AWS S3 bucket. To connect MuleSoft logs, follow these steps:
Step 1: Export your logs from Mulesoft to an AWS S3 bucket
The export process depends on your MuleSoft deployment type. Follow the steps below that match your setup.
If your Mule runtime writes logs to the local file system (for example, a standalone on-premises deployment), MuleSoft logs application events to a local file, typically /logs/mule-app.log (the default location, configured in log4j2.xml).
You can configure MuleSoft to upload logs to S3 using the Amazon S3 Connector:
<!-- A flow that periodically reads the local log file and uploads it to S3.
     Assumes File and Amazon S3 connector configurations are defined elsewhere in the application. -->
<flow name="uploadLogsToS3">
    <scheduler frequency="60000"/> <!-- Every 60 seconds -->
    <file:read path="/logs/mule-app.log"/>
    <s3:put-object bucketName="my-mulesoft-logs" key="mule-logs/${date}.log" content="#[payload]"/>
    <logger message="Logs uploaded to S3" level="INFO"/>
</flow>
This flow reads logs from mule-app.log and uploads them to an S3 bucket named my-mulesoft-logs.
If your application runs on CloudHub, MuleSoft logs application events based on the Log4j2 configuration. The logs are typically:
Available in Runtime Manager under the Logs tab
Accessible via the CloudHub API
Step 1: Extract logs from MuleSoft CloudHub using the CloudHub API
Get your Anypoint platform access token.
Fetch logs from CloudHub using the CloudHub API.
Step 2: Trigger an AWS Lambda function to process the logs
Create an IAM Role for Lambda (a scripted example appears after this list).
Create a Lambda function.
Write the Lambda code to perform the following:
Retrieve an access token from MuleSoft's Anypoint API.
Fetch application logs from MuleSoft CloudHub using the API.
Format the log data and upload it to an AWS S3 bucket.
Run automatically whenever triggered by AWS CloudWatch.
💡Suggested code
See this example code as a suggestion; replace the placeholder values (client ID, client secret, application name, bucket name) with your own:
import json
import requests
import boto3
import datetime

# AWS S3 Configuration
S3_BUCKET_NAME = "your-mulesoft-logs-bucket"

# MuleSoft API Configuration
MULESOFT_CLIENT_ID = "YOUR_CLIENT_ID"
MULESOFT_CLIENT_SECRET = "YOUR_CLIENT_SECRET"
CLOUDHUB_APP_NAME = "YOUR_APP_NAME"
ANYPOINT_URL = "https://anypoint.mulesoft.com"

def get_mulesoft_access_token():
    """Retrieve an access token from MuleSoft Anypoint."""
    url = f"{ANYPOINT_URL}/identity/v2/oauth2/token"
    payload = {
        "grant_type": "client_credentials",
        "client_id": MULESOFT_CLIENT_ID,
        "client_secret": MULESOFT_CLIENT_SECRET
    }
    headers = {"Content-Type": "application/x-www-form-urlencoded"}
    response = requests.post(url, data=payload, headers=headers)
    response.raise_for_status()
    return response.json()["access_token"]

def fetch_mulesoft_logs():
    """Fetch logs from MuleSoft CloudHub API."""
    token = get_mulesoft_access_token()
    url = f"{ANYPOINT_URL}/cloudhub/api/v2/applications/{CLOUDHUB_APP_NAME}/logs"
    headers = {"Authorization": f"Bearer {token}"}
    response = requests.get(url, headers=headers)
    response.raise_for_status()
    return response.json()["data"]

def upload_logs_to_s3(logs):
    """Upload logs to S3 as a JSON file."""
    s3_client = boto3.client("s3")
    timestamp = datetime.datetime.utcnow().strftime("%Y-%m-%d_%H-%M-%S")
    s3_key = f"mule-logs/logs_{timestamp}.json"
    log_data = json.dumps(logs, indent=2)
    s3_client.put_object(
        Bucket=S3_BUCKET_NAME,
        Key=s3_key,
        Body=log_data,
        ContentType="application/json"
    )
    print(f"Logs uploaded to S3: {s3_key}")

def lambda_handler(event, context):
    """Lambda function entry point."""
    logs = fetch_mulesoft_logs()
    upload_logs_to_s3(logs)
    return {"statusCode": 200, "body": json.dumps("Logs successfully uploaded to S3")}
Deploy the Lambda function.
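If you prefer to script the IAM role step above instead of using the AWS console, the boto3 sketch below is one way to do it. The role name, policy name, and bucket ARN are illustrative placeholders; adjust them to your environment.

import json
import boto3

iam = boto3.client("iam")

# Trust policy that lets the Lambda service assume the role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

# Hypothetical role name for the log export function
role = iam.create_role(
    RoleName="mulesoft-log-export-lambda-role",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
    Description="Execution role for the MuleSoft log export Lambda",
)

# CloudWatch Logs permissions for the function's own logging
iam.attach_role_policy(
    RoleName="mulesoft-log-export-lambda-role",
    PolicyArn="arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
)

# Allow the function to write objects into the log bucket
s3_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "s3:PutObject",
        "Resource": "arn:aws:s3:::your-mulesoft-logs-bucket/*",
    }],
}
iam.put_role_policy(
    RoleName="mulesoft-log-export-lambda-role",
    PolicyName="mulesoft-logs-s3-put",
    PolicyDocument=json.dumps(s3_policy),
)

print("Role ARN:", role["Role"]["Arn"])

Use the printed role ARN as the execution role when you create and deploy the Lambda function.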
Step 3: Automate log exporting with CloudWatch
Create a CloudWatch (EventBridge) scheduled rule that triggers the Lambda function at a regular interval, so logs are exported automatically (see the sketch below).
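The following boto3 sketch shows one way to create such a scheduled rule and point it at the Lambda function. The rule name, function name, function ARN, and the 5-minute schedule are assumptions; adjust them to your environment.

import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

# Placeholders: adjust to your function name, ARN, and desired schedule
RULE_NAME = "mulesoft-log-export-schedule"
FUNCTION_NAME = "mulesoft-log-export"
FUNCTION_ARN = "arn:aws:lambda:us-east-1:123456789012:function:mulesoft-log-export"

# 1. Scheduled rule that fires every 5 minutes
rule = events.put_rule(
    Name=RULE_NAME,
    ScheduleExpression="rate(5 minutes)",
    State="ENABLED",
)

# 2. Allow EventBridge to invoke the Lambda function
lambda_client.add_permission(
    FunctionName=FUNCTION_NAME,
    StatementId="allow-eventbridge-invoke",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)

# 3. Point the rule at the function
events.put_targets(
    Rule=RULE_NAME,
    Targets=[{"Id": "mulesoft-log-export-lambda", "Arn": FUNCTION_ARN}],
)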
Step 2: Set up the connection on Hunters
Once the export is configured and logs are being collected in the S3 bucket, follow the steps in this section.
Expected format
Logs are expected in Syslog format, for example:
Oct 6 09:46:14 abcdeap080 systemd: am-fileabc.service: control process exited, code=exited status=123
Oct 6 09:46:09 ABCDAP1139 kernel: type=1400 audit(12345.557:1234): avc: denied { read } for pid=123 comm="abc-watch-log" name="messages" dev="sda5" ino=123452 scontext=system_u:system_r:abc_de:s0 tcontext=unconfined_a:object_b:var_t:s3 tclass=file permissive=0