
Commit 604ab81

Add null and equals filter types (#63)
* feat: update CLAUDE.md to match the current implementation
* feat: add support for null and equals query types
* ci: linux test image should use swift 6.2
1 parent: 92c232f

File tree: 5 files changed, +355 −34 lines


.github/workflows/testlinux.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -9,7 +9,7 @@ on:
 jobs:
   build:
     runs-on: ubuntu-latest
-    container: swift:5.9-jammy
+    container: swift:6.2

     steps:
       - name: Check out Source
```

CLAUDE.md

Lines changed: 34 additions & 30 deletions
```diff
@@ -42,43 +42,50 @@ swift package generate-xcodeproj

 ## Architecture Overview

-### Core Data Models
-- **DTOv1/DTOv2**: Main data transfer objects with versioning
-  - `DTOv1`: Legacy models (InsightGroup, LexiconPayloadKey, OrganizationJoinRequest)
-  - `DTOv2`: Current models (Organization, User, App, Insight, Badge, etc.)
-- **Models.swift**: Additional DTOs for API requests, authentication, and UI state
-
-### Query System
+### Query System (`Query/`)
 - **CustomQuery**: Main query builder for Apache Druid integration
-  - Supports multiple query types: timeseries, groupBy, topN, scan, timeBoundary, funnel, experiment
+  - Query types: `timeseries`, `groupBy`, `topN`, `scan`, `timeBoundary`, `funnel`, `experiment`, `retention`
   - Handles filters, aggregations, post-aggregations, and time intervals
 - **Query Components**:
-  - `Aggregator`: Define aggregation functions (sum, count, etc.)
+  - `Aggregator`: Aggregation functions (sum, count, etc.)
   - `Filter`: Query filtering logic
   - `DimensionSpec`: Dimension specifications for grouping
   - `QueryGranularity`: Time granularity (day, week, month)
   - `VirtualColumn`: Computed columns
-
-### Druid Integration
-- **Druid/**: Complete Apache Druid configuration DTOs
-  - `configuration/`: Tuning configs, compaction configs
-  - `data/input/`: Input formats, sources, and dimension specs
-  - `indexing/`: Parallel indexing, batch processing
-  - `ingestion/`: Native batch ingestion specs
-  - `segment/`: Data schema and transformation specs
-  - `Supervisor/`: Kafka streaming supervision
-
-### Chart Configuration
-- **ChartConfiguration**: Display settings for analytics charts
-- **ChartDefinitionDTO**: Chart metadata and configuration
-- **InsightDisplayMode**: Chart types (lineChart, barChart, pieChart, etc.)
-
-### Query Results
-- **QueryResult**: Polymorphic result handling for different query types
+  - `PostAggregator`: Post-aggregation calculations
+  - `Datasource`: Data source configuration
+
+### Query Generation (`QueryGeneration/`)
+- **CustomQuery+Funnel**: Funnel analysis query generation
+- **CustomQuery+Experiment**: A/B experiment queries
+- **CustomQuery+Retention**: Retention analysis queries
+- **Precompilable**: Query precompilation protocol
+- **SQLQueryConversion**: SQL conversion utilities
+
+### Query Results (`QueryResult/`)
+- **QueryResult**: Polymorphic enum for different result types
 - **TimeSeriesQueryResult**: Time-based query results
 - **TopNQueryResult**: Top-N dimension results
 - **GroupByQueryResult**: Grouped aggregation results
 - **ScanQueryResult**: Raw data scanning results
+- **TimeBoundaryResult**: Time boundary query results
+- Helper types: `StringWrapper`, `DoubleWrapper`, `DoublePlusInfinity`
+
+### Druid Configuration (`Druid/`)
+- `configuration/`: TuningConfig, AutoCompactionConfig
+- `data/input/`: Input formats and dimension specs
+- `indexer/`: Granularity specs
+- `indexing/`: Kinesis streaming, parallel batch indexing
+- `ingestion/`: Task specs, native batch, ingestion specs
+- `segment/`: Data schema and transform specs
+
+### Supervisor (`Supervisor/`)
+- Kafka/Kinesis streaming supervision DTOs
+
+### Chart Configuration (`Chart Configuration/`)
+- **ChartConfiguration**: Display settings for analytics charts
+- **ChartAggregationConfiguration**: Aggregation configuration
+- **ChartConfigurationOptions**: Chart options

 ## Key Dependencies

@@ -87,9 +94,6 @@ swift package generate-xcodeproj

 ## Development Notes

-### DTO Versioning
-The library uses a versioning strategy with `DTOv1` and `DTOv2` namespaces. `DTOv2.Insight` is deprecated in favor of V3InsightsController patterns.
-
 ### Query Hashing
 CustomQuery implements stable hashing using SHA256 for caching and query deduplication. The `stableHashValue` property provides consistent query identification.

@@ -98,7 +102,7 @@ Tests are organized by functionality:
 - **DataTransferObjectsTests**: Basic DTO serialization/deserialization
 - **QueryTests**: Query building and validation
 - **QueryResultTests**: Result parsing and handling
-- **QueryGenerationTests**: Advanced query generation (funnels, experiments)
+- **QueryGenerationTests**: Advanced query generation (funnels, experiments, retention)
 - **SupervisorTests**: Druid supervisor configuration
 - **DataSchemaTests**: Data ingestion schema validation
```
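The Query Hashing note above says `stableHashValue` derives a SHA256 digest for caching and deduplication. A minimal, hypothetical sketch of the precondition for that (not the library's actual implementation): the encoded bytes must be deterministic before hashing, which `JSONEncoder`'s `.sortedKeys` output formatting provides. `MiniQuery` is an invented stand-in type.

```swift
import Foundation

// Hypothetical sketch: stable hashing needs deterministic bytes first.
// JSONEncoder's .sortedKeys makes key order deterministic, so hashing the
// encoded bytes (e.g. with SHA256, as described for `stableHashValue`)
// yields the same digest for semantically identical queries.
struct MiniQuery: Codable {
    let queryType: String
    let granularity: String
}

let encoder = JSONEncoder()
encoder.outputFormatting = [.sortedKeys]

let a = try! encoder.encode(MiniQuery(queryType: "timeseries", granularity: "day"))
let b = try! encoder.encode(MiniQuery(queryType: "timeseries", granularity: "day"))
print(String(data: a, encoding: .utf8)!)  // keys appear in sorted order
print(a == b)                             // identical bytes for identical queries
```

Without `.sortedKeys`, two encodings of the same value could legally differ in key order, producing different digests for the same query.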

Sources/DataTransferObjects/Query/CustomQuery+CompileDown.swift

Lines changed: 4 additions & 0 deletions
```diff
@@ -158,6 +158,10 @@ public extension CustomQuery {
             return filter
         case .range:
             return filter
+        case .equals:
+            return filter
+        case .null:
+            return filter
         case .and(let filterExpression):
             return Filter.and(.init(fields: filterExpression.fields.map { compileRelativeFilterInterval(filter: $0) }))
         case .or(let filterExpression):
```
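The change above follows the pass-through pattern of `compileRelativeFilterInterval`: leaf filter cases that carry no intervals (now including `.equals` and `.null`) are returned unchanged, while composite cases recurse into their children. A standalone sketch of that pattern, using an invented `MiniFilter` enum rather than the library's `Filter`:

```swift
// Standalone sketch (invented types, not the package's API): a recursive
// rewrite over an indirect enum, where leaf cases are returned untouched
// and only composite cases map the transform over their children.
indirect enum MiniFilter: Equatable {
    case equals(column: String, value: String)
    case null(column: String)
    case and([MiniFilter])
}

func compile(_ filter: MiniFilter) -> MiniFilter {
    switch filter {
    case .equals, .null:
        return filter                       // leaves pass through unchanged
    case .and(let fields):
        return .and(fields.map(compile))    // composites recurse
    }
}

let f = MiniFilter.and([.equals(column: "type", value: "click"),
                        .null(column: "userID")])
print(compile(f) == f)  // no interval-bearing leaves, so nothing changes
```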

Sources/DataTransferObjects/Query/Filter.swift

Lines changed: 108 additions & 3 deletions
```diff
@@ -129,8 +129,97 @@ public struct FilterNot: Codable, Hashable, Equatable, Sendable {
     public let field: Filter
 }

+/// The equality filter matches rows where a column value equals a specific value.
+public struct FilterEquals: Codable, Hashable, Equatable, Sendable {
+    public init(column: String, matchValueType: MatchValueType, matchValue: MatchValue) {
+        self.column = column
+        self.matchValueType = matchValueType
+        self.matchValue = matchValue
+    }
+
+    public enum MatchValueType: String, Codable, Hashable, Equatable, Sendable {
+        case string = "STRING"
+        case long = "LONG"
+        case double = "DOUBLE"
+        case float = "FLOAT"
+        case arrayString = "ARRAY<STRING>"
+        case arrayLong = "ARRAY<LONG>"
+        case arrayDouble = "ARRAY<DOUBLE>"
+        case arrayFloat = "ARRAY<FLOAT>"
+    }
+
+    public enum MatchValue: Hashable, Equatable, Sendable {
+        case string(String)
+        case int(Int)
+        case double(Double)
+        case arrayString([String])
+        case arrayInt([Int])
+        case arrayDouble([Double])
+    }
+
+    public let column: String
+    public let matchValueType: MatchValueType
+    public let matchValue: MatchValue
+}
+
+extension FilterEquals.MatchValue: Codable {
+    public init(from decoder: Decoder) throws {
+        let container = try decoder.singleValueContainer()
+
+        if let arrayString = try? container.decode([String].self) {
+            self = .arrayString(arrayString)
+        } else if let arrayInt = try? container.decode([Int].self) {
+            self = .arrayInt(arrayInt)
+        } else if let arrayDouble = try? container.decode([Double].self) {
+            self = .arrayDouble(arrayDouble)
+        } else if let string = try? container.decode(String.self) {
+            self = .string(string)
+        } else if let int = try? container.decode(Int.self) {
+            self = .int(int)
+        } else if let double = try? container.decode(Double.self) {
+            self = .double(double)
+        } else {
+            throw DecodingError.typeMismatch(
+                FilterEquals.MatchValue.self,
+                DecodingError.Context(
+                    codingPath: decoder.codingPath,
+                    debugDescription: "Expected String, Int, Double, or array types"
+                )
+            )
+        }
+    }
+
+    public func encode(to encoder: Encoder) throws {
+        var container = encoder.singleValueContainer()
+
+        switch self {
+        case .string(let value):
+            try container.encode(value)
+        case .int(let value):
+            try container.encode(value)
+        case .double(let value):
+            try container.encode(value)
+        case .arrayString(let value):
+            try container.encode(value)
+        case .arrayInt(let value):
+            try container.encode(value)
+        case .arrayDouble(let value):
+            try container.encode(value)
+        }
+    }
+}
+
+/// The null filter matches rows where a column value is null.
+public struct FilterNull: Codable, Hashable, Equatable, Sendable {
+    public init(column: String) {
+        self.column = column
+    }
+
+    public let column: String
+}
+
 /// A filter is a JSON object indicating which rows of data should be included in the computation
-/// for a query. Its essentially the equivalent of the WHERE clause in SQL.
+/// for a query. It's essentially the equivalent of the WHERE clause in SQL.
 public indirect enum Filter: Codable, Hashable, Equatable, Sendable {
     /// The selector filter will match a specific dimension with a specific value.
     /// Selector filters can be used as the base filters for more complex Boolean
@@ -157,6 +246,12 @@ public indirect enum Filter: Codable, Hashable, Equatable, Sendable {
     // to, less than or equal to, and "between"
     case range(FilterRange)

+    /// The equality filter matches rows where a column value equals a specific value.
+    case equals(FilterEquals)
+
+    /// The null filter matches rows where a column value is null.
+    case null(FilterNull)
+
     // logical expression filters
     case and(FilterExpression)
     case or(FilterExpression)
@@ -179,14 +274,18 @@ public indirect enum Filter: Codable, Hashable, Equatable, Sendable {
             self = try .interval(FilterInterval(from: decoder))
         case "regex":
             self = try .regex(FilterRegex(from: decoder))
+        case "range":
+            self = try .range(FilterRange(from: decoder))
+        case "equals":
+            self = try .equals(FilterEquals(from: decoder))
+        case "null":
+            self = try .null(FilterNull(from: decoder))
         case "and":
             self = try .and(FilterExpression(from: decoder))
         case "or":
             self = try .or(FilterExpression(from: decoder))
         case "not":
             self = try .not(FilterNot(from: decoder))
-        case "range":
-            self = try .range(FilterRange(from: decoder))
         default:
             throw EncodingError.invalidValue("Invalid type", .init(codingPath: [CodingKeys.type], debugDescription: "Invalid Type", underlyingError: nil))
         }
@@ -216,6 +315,12 @@ public indirect enum Filter: Codable, Hashable, Equatable, Sendable {
         case let .range(range):
             try container.encode("range", forKey: .type)
             try range.encode(to: encoder)
+        case let .equals(equals):
+            try container.encode("equals", forKey: .type)
+            try equals.encode(to: encoder)
+        case let .null(null):
+            try container.encode("null", forKey: .type)
+            try null.encode(to: encoder)
         case let .and(and):
             try container.encode("and", forKey: .type)
             try and.encode(to: encoder)
```
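Per the `encode(to:)` additions above, the new filters serialize as a `type` discriminator plus the struct's own fields. A standalone sketch of the resulting JSON shape, using trimmed, invented mirror types (`EqualsFilterSketch` covers only the string case; the real `FilterEquals` supports the full `MatchValue` range) rather than the package's actual `Filter` enum:

```swift
import Foundation

// Standalone sketch (invented types, not the library itself): trimmed
// mirrors of the new filters, showing the JSON shape they encode to.
// Field names follow the diff above; `type` mimics the discriminator
// that Filter.encode(to:) writes alongside the payload.
struct EqualsFilterSketch: Encodable {
    let type = "equals"
    let column: String
    let matchValueType: String  // e.g. "STRING"
    let matchValue: String      // real FilterEquals also allows numbers/arrays
}

struct NullFilterSketch: Encodable {
    let type = "null"
    let column: String
}

let encoder = JSONEncoder()
encoder.outputFormatting = [.sortedKeys]

let equals = EqualsFilterSketch(column: "appID", matchValueType: "STRING", matchValue: "abc")
let null = NullFilterSketch(column: "deletedAt")
print(String(data: try! encoder.encode(equals), encoding: .utf8)!)
print(String(data: try! encoder.encode(null), encoding: .utf8)!)
```

Note the decoding side of `MatchValue` tries array types before scalars: since the value is untagged JSON, attempting `[String]` first prevents an array from being mis-decoded by a later scalar branch.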
