`pkg/stanza/docs/operators/recombine.md` (16 additions, 16 deletions)
The `recombine` operator combines consecutive logs into single logs based on simple expression rules.
### Configuration Fields

| Field | Default | Description |
| --- | --- | --- |
| `id` | `recombine` | A unique identifier for the operator. |
| `output` | Next in pipeline | The connected operator(s) that will receive all outbound entries. |
| `on_error` | `send` | The behavior of the operator if it encounters an error. See [on_error](../types/on_error.md). |
| `is_first_entry` | | An [expression](../types/expression.md) that returns true if the entry being processed is the first entry in a multiline series. |
| `is_last_entry` | | An [expression](../types/expression.md) that returns true if the entry being processed is the last entry in a multiline series. |
| `combine_field` | required | The [field](../types/field.md) from all the entries that will be recombined. |
| `combine_with` | `"\n"` | The string placed between the combined entries. This can also be an empty string. When using special characters such as `\n`, be sure to enclose the value in double quotes: `"\n"`. |
| `max_batch_size` | 1000 | The maximum number of consecutive entries that will be combined into a single entry. |
| `max_unmatched_batch_size` | 100 | The maximum number of consecutive entries that will be combined into a single entry before a match occurs (via `is_first_entry` or `is_last_entry`). For example, `max_unmatched_batch_size=0` combines all entries, `max_unmatched_batch_size=1` leaves all entries uncombined until the match occurs, and `max_unmatched_batch_size=100` combines entries into batches of 100 until the match occurs. |
| `overwrite_with` | `newest` | Whether to use the fields from the `oldest` or the `newest` entry for all fields that are not combined. |
| `force_flush_period` | `5s` | Flush timeout after which entries will be flushed, aborting the wait for their remaining parts to be merged. |
| `source_identifier` | `attributes["log.file.path"]` | The [field](../types/field.md) that separates one source of logs from others when combining them. |
| `max_sources` | 1000 | The maximum number of unique sources that may be tracked concurrently for separate combining. |
| `max_log_size` | 0 | The maximum byte size of the combined field. Once the size exceeds this limit, all received entries of the source will be combined and flushed. A `max_log_size` of `0` means no limit. |
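As a minimal sketch of how these fields fit together, the following hypothetical configuration recombines container log lines split by a runtime in the CRI format. It assumes an upstream parser has already set a `logtag` attribute whose value is `F` on the final line of each log; that attribute name and value are assumptions about the pipeline, not something this operator provides:

```yaml
# Sketch: recombine CRI-style partial lines into one entry.
# Assumes a prior parser populated attributes.logtag ("P" = partial, "F" = final).
- type: recombine
  combine_field: body
  combine_with: ""                       # partial lines are fragments of one line, so join with nothing
  is_last_entry: attributes.logtag == 'F'
  max_unmatched_batch_size: 1            # emit unmatched lines individually rather than batching them
```

Here `is_last_entry` fires on lines the runtime marked final, so partial lines accumulate until a final line arrives and the batch is flushed as a single entry.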
Exactly one of `is_first_entry` and `is_last_entry` must be specified.
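As a sketch of the `is_first_entry` alternative, the following hypothetical configuration treats any line that does not begin with whitespace as the start of a new log, which recombines indented continuation lines such as Java stack traces into the entry that precedes them:

```yaml
# Sketch: treat non-indented lines as the start of a new multiline log.
- type: recombine
  combine_field: body
  combine_with: "\n"                     # preserve line breaks inside the recombined log
  is_first_entry: body matches "^[^\\s]" # a line starting with a non-whitespace character begins a new entry
```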