
Merge pull request #6 from newrelic/configurable-custom-fields
feat: Configurable custom fields storing
gsidhwani-nr authored Nov 19, 2024
2 parents 64b523f + 5f06e2e commit 479ec8a
Showing 11 changed files with 106 additions and 114 deletions.
16 changes: 12 additions & 4 deletions README.md
@@ -18,7 +18,7 @@

# New Relic Log4j2 Appender

A custom Log4j2 appender that sends logs to New Relic.
A custom Log4j2 appender that sends logs to New Relic. This appender supports both plain text log messages and JSON log objects.

## Installation

@@ -28,7 +28,7 @@ Add the library to your project using Maven Central:
<dependency>
<groupId>com.newrelic.labs</groupId>
<artifactId>custom-log4j2-appender</artifactId>
<version>1.0.2</version>
<version>1.0.3</version>
</dependency>
```

@@ -38,7 +38,7 @@ Or, if using a locally built JAR file:
<dependency>
<groupId>com.newrelic.labs</groupId>
<artifactId>custom-log4j2-appender</artifactId>
<version>1.0.2</version>
<version>1.0.3</version>
<scope>system</scope>
<systemPath>${project.basedir}/src/main/resources/custom-log4j2-appender.jar</systemPath>
</dependency>
@@ -68,7 +68,8 @@ Replace `[your-api-key]` with the ingest key obtained from the New Relic platform
batchSize="5000"
maxMessageSize="1048576"
flushInterval="120000"
customFields="businessGroup=exampleGroup,environment=production">
customFields="businessGroup=exampleGroup,environment=production"
mergeCustomFields="true">
<PatternLayout pattern="[%d{MM-dd HH:mm:ss}] %-5p %c{1} [%t]: %m%n"/>
</NewRelicBatchingAppender>
</Appenders>
@@ -93,11 +94,18 @@ Replace `[your-api-key]` with the ingest key obtained from the New Relic platform
| maxMessageSize | No | 1048576 | Maximum size (in bytes) of the payload to be sent in a single HTTP request |
| flushInterval | No | 120000 | Interval (in milliseconds) at which the log entries are flushed to New Relic|
| customFields | No | | Add extra context to your logs with custom fields, represented as comma-separated name-value pairs.|
| mergeCustomFields | No | false | When `false` (the default), custom fields appear as subfields of `custom` (e.g., `custom.field1`, `custom.field2`); when `true`, they are merged into the log event as top-level attributes (e.g., `field1`, `field2`) |



## Custom Fields [v1.0.1+]
Custom fields provide a way to include additional custom data in your logs. They are represented as comma-separated name-value pairs. This feature allows you to add more context to your logs, making them more meaningful and easier to analyze.

## Configuring Custom Fields as Subfields or Top-Level Attributes [v1.0.3+]
Starting with version 1.0.3, a new configuration parameter, `mergeCustomFields`, is available. By default, custom fields appear as subfields of the `custom` field (e.g., `custom.field1`, `custom.field2`). If `mergeCustomFields` is set to `true`, custom fields are instead merged into the log event as top-level attributes (e.g., `field1`, `field2`), as illustrated below.
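
For example, with `customFields="businessGroup=exampleGroup,environment=production"`, the delivered log event looks roughly like the sketches below (attributes such as `hostname` and `timestamp` are omitted; the exact shape is illustrative).

With `mergeCustomFields="false"` (the default):

```json
{
  "message": "Application started",
  "custom": {
    "businessGroup": "exampleGroup",
    "environment": "production"
  }
}
```

With `mergeCustomFields="true"`:

```json
{
  "message": "Application started",
  "businessGroup": "exampleGroup",
  "environment": "production"
}
```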



### TLS 1.2 Requirement

New Relic only accepts connections from clients using TLS version 1.2 or greater. Ensure that your execution environment is configured to use TLS 1.2 or greater.
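For example, on an older JVM you can pass `-Dhttps.protocols=TLSv1.2` on the `java` command line; this standard system property applies when the client uses the JVM's default `HttpsURLConnection` stack (whether it affects this appender depends on the HTTP client in use, so treat it as a starting point rather than a guarantee).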
4 changes: 2 additions & 2 deletions custom-log4j2-appender/build-jar.gradle
@@ -20,7 +20,7 @@ jar {
'Implementation-Title': 'Custom Log4j2 Appender',
'Implementation-Vendor': 'New Relic Labs',
'Implementation-Vendor-Id': 'com.newrelic.labs',
'Implementation-Version': '1.0.2'
'Implementation-Version': '1.0.3'
)
}
}
@@ -53,7 +53,7 @@ publishing {

groupId = 'com.newrelic.labs'
artifactId = 'custom-log4j2-appender'
version = '1.0.2'
version = '1.0.3'

pom {
name = 'Custom Log4j2 Appender'
4 changes: 2 additions & 2 deletions custom-log4j2-appender/build-shadowJar.gradle
@@ -27,7 +27,7 @@ shadowJar {
'Implementation-Title': 'Custom Log4j2 Appender',
'Implementation-Vendor': 'New Relic Labs',
'Implementation-Vendor-Id': 'com.newrelic.labs',
'Implementation-Version': '1.0.2'
'Implementation-Version': '1.0.3'
)
}
}
@@ -55,7 +55,7 @@ publishing {

groupId = 'com.newrelic.labs'
artifactId = 'custom-log4j2-appender'
version = '1.0.2'
version = '1.0.3'

pom {
name = 'Custom Log4j2 Appender'
4 changes: 2 additions & 2 deletions custom-log4j2-appender/build.gradle
@@ -27,7 +27,7 @@ shadowJar {
'Implementation-Title': 'Custom Log4j2 Appender',
'Implementation-Vendor': 'New Relic Labs',
'Implementation-Vendor-Id': 'com.newrelic.labs',
'Implementation-Version': '1.0.2'
'Implementation-Version': '1.0.3'
)
}
}
Expand Down Expand Up @@ -55,7 +55,7 @@ publishing {

groupId = 'com.newrelic.labs'
artifactId = 'custom-log4j2-appender'
version = '1.0.2'
version = '1.0.3'

pom {
name = 'Custom Log4j2 Appender'

This file was deleted.

2 changes: 1 addition & 1 deletion custom-log4j2-appender/publish-jar-legacy.sh
@@ -7,7 +7,7 @@ cp build-jar.gradle build.gradle
# Set variables
GROUP_ID="com.newrelic.labs"
ARTIFACT_ID="custom-log4j2-appender"
VERSION="1.0.2"
VERSION="1.0.3"
KEY_ID="0ED9FD74E81E6D83FAE25F235640EA0B1C631C6F" # Replace with your actual key ID

# Get the current directory (assuming the script is run from the custom-log4j2-appender directory)
2 changes: 1 addition & 1 deletion custom-log4j2-appender/publish-jar.sh
@@ -7,7 +7,7 @@ cp build-jar.gradle build.gradle
# Set variables
GROUP_ID="io.github.newrelic-experimental"
ARTIFACT_ID="custom-log4j2-appender"
VERSION="1.0.2"
VERSION="1.0.3"
KEY_ID="0ED9FD74E81E6D83FAE25F235640EA0B1C631C6F" # Replace with your actual key ID

# Get the current directory (assuming the script is run from the custom-log4j2-appender directory)
2 changes: 1 addition & 1 deletion custom-log4j2-appender/publish-shadowJar.sh
@@ -6,7 +6,7 @@ cp build-shadowJar.gradle build.gradle
# Set variables
GROUP_ID="io.github.newrelic-experimental"
ARTIFACT_ID="custom-log4j2-appender"
VERSION="1.0.2"
VERSION="1.0.3"
KEY_ID="0ED9FD74E81E6D83FAE25F235640EA0B1C631C6F" # Replace with your actual key ID

# Get the current directory (assuming the script is run from the custom-log4j2-appender directory)
LogEntry.java
@@ -8,16 +8,16 @@ public class LogEntry {
private final String name;
private final String logtype;
private final long timestamp;
private final Map<String, Object> custom; // Add custom fields


public LogEntry(String message, String applicationName, String name, String logtype, long timestamp,
Map<String, Object> custom) {
Map<String, Object> custom, boolean mergeCustomFields) {
this.message = message;
this.applicationName = applicationName;
this.name = name;
this.logtype = logtype;
this.timestamp = timestamp;
this.custom = custom; // Initialize custom fields

}

public String getMessage() {
@@ -39,8 +39,4 @@ public String getLogType() {
public long getTimestamp() {
return timestamp;
}

public Map<String, Object> getcustom() {
return custom;
}
}
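
Net effect: `LogEntry` no longer stores the custom-field map or exposes `getcustom()`; the constructor gains a `mergeCustomFields` flag, and the custom fields themselves are now supplied to the forwarder at flush time (see the forwarder changes below).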
@@ -37,7 +37,7 @@ public boolean isInitialized() {
return apiKey != null && apiURL != null;
}

public void flush(List<LogEntry> logEntries) {
public void flush(List<LogEntry> logEntries, boolean mergeCustomFields, Map<String, Object> customFields) {
InetAddress localhost = null;
try {
localhost = InetAddress.getLocalHost();
@@ -59,8 +59,17 @@ public void flush(List<LogEntry> logEntries) {
logEvent.put("source", "NRBatchingAppender");

// Add custom fields
if (entry.getcustom() != null) {
logEvent.put("custom", entry.getcustom());
if (customFields != null) {
    if (mergeCustomFields) {
        // Add each custom field as a top-level attribute of the log event
        for (Map.Entry<String, Object> field : customFields.entrySet()) {
            logEvent.put(field.getKey(), field.getValue());
        }
    } else {
        // Nest all custom fields under a single "custom" attribute
        logEvent.put("custom", customFields);
    }
}

logEvents.add(logEvent);
@@ -70,7 +79,7 @@ public void flush(List<LogEntry> logEntries) {
byte[] compressedPayload = gzipCompress(jsonPayload);

if (compressedPayload.length > maxMessageSize) {
splitAndSendLogs(logEntries);
splitAndSendLogs(logEntries, mergeCustomFields, customFields);
} else {
sendLogs(logEvents);
}
@@ -79,26 +88,50 @@
}
}

private void splitAndSendLogs(List<LogEntry> logEntries) throws IOException {
private void splitAndSendLogs(List<LogEntry> logEntries, boolean mergeCustomFields,
Map<String, Object> customFields) throws IOException {
List<LogEntry> subBatch = new ArrayList<>();
int currentSize = 0;
for (LogEntry entry : logEntries) {
String entryJson = objectMapper.writeValueAsString(entry);
Map<String, Object> logEvent = objectMapper.convertValue(entry, LowercaseKeyMap.class);
logEvent.put("hostname", InetAddress.getLocalHost().getHostName());
logEvent.put("logtype", entry.getLogType());
logEvent.put("timestamp", entry.getTimestamp());
logEvent.put("applicationName", entry.getApplicationName());
logEvent.put("name", entry.getName());
logEvent.put("source", "NRBatchingAppender");

// Add custom fields
if (customFields != null) {
    if (mergeCustomFields) {
        // Add each custom field as a top-level attribute of the log event
        for (Map.Entry<String, Object> field : customFields.entrySet()) {
            logEvent.put(field.getKey(), field.getValue());
        }
    } else {
        // Nest all custom fields under a single "custom" attribute
        logEvent.put("custom", customFields);
    }
}

String entryJson = objectMapper.writeValueAsString(logEvent);
int entrySize = gzipCompress(entryJson).length;
if (currentSize + entrySize > maxMessageSize) {
sendLogs(convertToLogEvents(subBatch));
sendLogs(convertToLogEvents(subBatch, mergeCustomFields, customFields));
subBatch.clear();
currentSize = 0;
}
subBatch.add(entry);
currentSize += entrySize;
}
if (!subBatch.isEmpty()) {
sendLogs(convertToLogEvents(subBatch));
sendLogs(convertToLogEvents(subBatch, mergeCustomFields, customFields));
}
}

private List<Map<String, Object>> convertToLogEvents(List<LogEntry> logEntries) {
private List<Map<String, Object>> convertToLogEvents(List<LogEntry> logEntries, boolean mergeCustomFields,
Map<String, Object> customFields) {
List<Map<String, Object>> logEvents = new ArrayList<>();
try {
InetAddress localhost = InetAddress.getLocalHost();
@@ -114,9 +147,17 @@ private List<Map<String, Object>> convertToLogEvents(List<LogEntry> logEntries)
logEvent.put("source", "NRBatchingAppender");

// Add custom fields
if (entry.getcustom() != null) {
logEvent.put("custom", entry.getcustom());
if (customFields != null) {
    if (mergeCustomFields) {
        // Add each custom field as a top-level attribute of the log event
        for (Map.Entry<String, Object> field : customFields.entrySet()) {
            logEvent.put(field.getKey(), field.getValue());
        }
    } else {
        // Nest all custom fields under a single "custom" attribute
        logEvent.put("custom", customFields);
    }
}

logEvents.add(logEvent);
@@ -176,12 +217,12 @@ private byte[] gzipCompress(String input) throws IOException {
return bos.toByteArray();
}

public void close() {
public void close(boolean mergeCustomFields, Map<String, Object> customFields) {
List<LogEntry> remainingLogs = new ArrayList<>();
logQueue.drainTo(remainingLogs);
if (!remainingLogs.isEmpty()) {
System.out.println("Flushing remaining " + remainingLogs.size() + " log events to New Relic...");
flush(remainingLogs);
flush(remainingLogs, mergeCustomFields, customFields);
}
}
}
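
The same merge block now appears in `flush`, `splitAndSendLogs`, and `convertToLogEvents`. A minimal, self-contained sketch of that shared behavior (the class and method names here are illustrative, not part of the commit):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the custom-field handling introduced in this commit.
// MergeDemo and addCustomFields are illustrative names, not from the source.
public class MergeDemo {

    static void addCustomFields(Map<String, Object> logEvent,
                                Map<String, Object> customFields,
                                boolean mergeCustomFields) {
        if (customFields == null) {
            return;
        }
        if (mergeCustomFields) {
            // Each custom field becomes a top-level attribute of the log event
            logEvent.putAll(customFields);
        } else {
            // All custom fields are nested under a single "custom" attribute
            logEvent.put("custom", customFields);
        }
    }

    public static void main(String[] args) {
        Map<String, Object> customFields = new HashMap<>();
        customFields.put("businessGroup", "exampleGroup");
        customFields.put("environment", "production");

        Map<String, Object> merged = new HashMap<>();
        addCustomFields(merged, customFields, true);
        System.out.println(merged);  // {environment=production, businessGroup=exampleGroup}

        Map<String, Object> nested = new HashMap<>();
        addCustomFields(nested, customFields, false);
        System.out.println(nested);  // {custom={environment=production, businessGroup=exampleGroup}}
    }
}
```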