Commit 935b34c

adchia authored and kevjumba committed
ci: Fixing local integration tests, defaulting to test containers (#2927)
1 parent f4f4894 commit 935b34c

10 files changed: +121 −17 lines changed
New GitHub Actions workflow: pr-local-integration-tests

Lines changed: 63 additions & 0 deletions (entire file added)

name: pr-local-integration-tests
# This runs local tests with containerized stubs of online stores. This is the main dev workflow

on:
  pull_request_target:
    types:
      - opened
      - synchronize
      - labeled

jobs:
  integration-test-python-local:
    # all jobs MUST have this if check for 'ok-to-test' or 'approved' for security purposes.
    if:
      (github.event.action == 'labeled' && (github.event.label.name == 'approved' || github.event.label.name == 'lgtm' || github.event.label.name == 'ok-to-test')) ||
      (github.event.action != 'labeled' && (contains(github.event.pull_request.labels.*.name, 'ok-to-test') || contains(github.event.pull_request.labels.*.name, 'approved') || contains(github.event.pull_request.labels.*.name, 'lgtm')))
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        python-version: [ "3.8" ]
        os: [ ubuntu-latest ]
    env:
      OS: ${{ matrix.os }}
      PYTHON: ${{ matrix.python-version }}
    steps:
      - uses: actions/checkout@v2
        with:
          # pull_request_target runs the workflow in the context of the base repo
          # as such actions/checkout needs to be explicitly configured to retrieve
          # code from the PR.
          ref: refs/pull/${{ github.event.pull_request.number }}/merge
          submodules: recursive
      - name: Setup Python
        uses: actions/setup-python@v2
        id: setup-python
        with:
          python-version: ${{ matrix.python-version }}
          architecture: x64
      - name: Upgrade pip version
        run: |
          pip install --upgrade "pip>=21.3.1,<22.1"
      - name: Get pip cache dir
        id: pip-cache
        run: |
          echo "::set-output name=dir::$(pip cache dir)"
      - name: pip cache
        uses: actions/cache@v2
        with:
          path: |
            ${{ steps.pip-cache.outputs.dir }}
            /opt/hostedtoolcache/Python
            /Users/runner/hostedtoolcache/Python
          key: ${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-pip-${{ hashFiles(format('**/py{0}-ci-requirements.txt', env.PYTHON)) }}
          restore-keys: |
            ${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-pip-
      - name: Install pip-tools
        run: pip install pip-tools
      - name: Install dependencies
        run: make install-python-ci-dependencies
      - name: Test local integration tests
        if: ${{ always() }}  # this will guarantee that step won't be canceled and resources won't leak
        run: make test-python-integration-local

CONTRIBUTING.md

Lines changed: 17 additions & 7 deletions

@@ -133,17 +133,19 @@ make test-python

 ### Integration Tests
 There are two sets of tests you can run:
-1. Local integration tests (for faster development)
+1. Local integration tests (for faster development; tests the file offline store & key online stores)
 2. Full integration tests (requires cloud environment setups)

 #### Local integration tests
-To get local integration tests running, you'll need to have Redis setup:
+To run these tests, you'll need to have Docker set up locally: [Get Docker](https://docs.docker.com/get-docker/)

-Redis
-1. Install Redis: [Quickstart](https://redis.io/topics/quickstart)
-2. Run `redis-server`
+This approach uses a file-based offline store and tests against emulated versions of Datastore, DynamoDB, and Redis running in ephemeral containers.

-Now run `make test-python-universal-local`
+These tests create new temporary tables / datasets locally only, and they are cleaned up when the containers are torn down.
+
+```sh
+make test-python-integration-local
+```

 #### Full integration tests
 To test across clouds, on top of setting up Redis, you also need GCP / AWS / Snowflake setup.
@@ -166,7 +168,15 @@ To test across clouds, on top of setting up Redis, you also need GCP / AWS / Snowflake setup.
 2. Modify `RedshiftDataSourceCreator` to use your credentials

 **Snowflake**
-- See https://signup.snowflake.com/
+1. See https://signup.snowflake.com/ to set up a trial.
+2. Then, to run successfully, you'll need some environment variables set up:
+```sh
+export SNOWFLAKE_CI_DEPLOYMENT='[snowflake_deployment]'
+export SNOWFLAKE_CI_USER='[your user]'
+export SNOWFLAKE_CI_PASSWORD='[your pw]'
+export SNOWFLAKE_CI_ROLE='[your CI role e.g. SYSADMIN]'
+export SNOWFLAKE_CI_WAREHOUSE='[your warehouse]'
+```

 Then run `make test-python-integration`. Note that for Snowflake / GCP / AWS, this will create new temporary tables / datasets.
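
For context on "emulated versions of Datastore, DynamoDB, and Redis, using ephemeral containers": these are started with testcontainers-python by the online store creator helpers touched later in this commit. Below is a minimal sketch of that pattern for Redis; the image tag and helper name are illustrative assumptions, not taken from the repo.

```python
# Minimal sketch of the ephemeral-container pattern used for local integration tests.
# Assumes testcontainers-python is installed; the Redis image tag is an illustrative choice.
from testcontainers.core.container import DockerContainer
from testcontainers.core.waiting_utils import wait_for_logs


def start_ephemeral_redis() -> dict:
    container = DockerContainer("redis:6.2").with_exposed_ports(6379)
    container.start()
    # Block until the container logs show readiness (this commit bumps the timeout to 10s).
    wait_for_logs(
        container=container, predicate="Ready to accept connections", timeout=10
    )
    port = container.get_exposed_port(6379)
    # Shape matches the online_store config the Redis creator returns.
    return {"type": "redis", "connection_string": f"localhost:{port},db=0"}
```

The container (and anything written into it) disappears when the test session tears it down, which is why this path needs Docker but no cloud credentials.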

Makefile

Lines changed: 20 additions & 5 deletions

@@ -68,8 +68,26 @@ test-python:
 test-python-integration:
 	FEAST_USAGE=False IS_TEST=True python -m pytest -n 8 --integration sdk/python/tests

+test-python-integration-local:
+	@(docker info > /dev/null 2>&1 && \
+		FEAST_USAGE=False \
+		IS_TEST=True \
+		FEAST_IS_LOCAL_TEST=True \
+		FEAST_LOCAL_ONLINE_CONTAINER=True \
+		python -m pytest -n 8 --integration \
+			-k "not test_apply_entity_integration and \
+				not test_apply_feature_view_integration and \
+				not test_apply_data_source_integration" \
+			sdk/python/tests \
+	) || echo "This script uses Docker, and it isn't running - please start the Docker Daemon and try again!";
+
 test-python-integration-container:
-	FEAST_USAGE=False IS_TEST=True FEAST_LOCAL_ONLINE_CONTAINER=True python -m pytest -n 8 --integration sdk/python/tests
+	@(docker info > /dev/null 2>&1 && \
+		FEAST_USAGE=False \
+		IS_TEST=True \
+		FEAST_LOCAL_ONLINE_CONTAINER=True \
+		python -m pytest -n 8 --integration sdk/python/tests \
+	) || echo "This script uses Docker, and it isn't running - please start the Docker Daemon and try again!";

 test-python-universal-contrib:
 	PYTHONPATH='.' \
@@ -104,14 +122,11 @@ test-python-universal-postgres:
 		not test_universal_types" \
 	sdk/python/tests

-test-python-universal-local:
-	FEAST_USAGE=False IS_TEST=True FEAST_IS_LOCAL_TEST=True python -m pytest -n 8 --integration sdk/python/tests
-
 test-python-universal:
 	FEAST_USAGE=False IS_TEST=True python -m pytest -n 8 --integration sdk/python/tests

 test-python-go-server: compile-go-lib
-	FEAST_USAGE=False IS_TEST=True FEAST_GO_FEATURE_RETRIEVAL=True pytest --integration --goserver sdk/python/tests
+	FEAST_USAGE=False IS_TEST=True pytest --integration --goserver sdk/python/tests

 format-python:
 	# Sort

sdk/python/tests/conftest.py

Lines changed: 4 additions & 1 deletion

@@ -110,7 +110,10 @@ def pytest_collection_modifyitems(config, items: List[Item]):
             items.append(t)

     goserver_tests = [t for t in items if "goserver" in t.keywords]
-    if should_run_goserver:
+    if not should_run_goserver:
+        for t in goserver_tests:
+            items.remove(t)
+    else:
         items.clear()
         for t in goserver_tests:
             items.append(t)
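
This conftest.py fix changes test collection: previously, when the goserver flag was off, goserver-marked tests were still left in the run; now they are explicitly deselected. A standalone sketch of the selection rule follows, with plain lists standing in for pytest items (the --goserver option itself is defined elsewhere in conftest and is assumed here).

```python
# Standalone sketch of the collection rule applied in pytest_collection_modifyitems.
# Plain strings stand in for collected pytest items.
from typing import List


def select_items(
    items: List[str], goserver_items: List[str], should_run_goserver: bool
) -> List[str]:
    if not should_run_goserver:
        # Default run: drop goserver tests, keep everything else.
        return [t for t in items if t not in goserver_items]
    # --goserver run: keep only the goserver tests.
    return list(goserver_items)


collected = ["test_online_retrieval", "test_go_feature_server"]
goserver_only = ["test_go_feature_server"]
assert select_items(collected, goserver_only, should_run_goserver=False) == ["test_online_retrieval"]
assert select_items(collected, goserver_only, should_run_goserver=True) == ["test_go_feature_server"]
```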

sdk/python/tests/integration/feature_repos/repo_configuration.py

Lines changed: 2 additions & 0 deletions

@@ -91,6 +91,7 @@
     "sqlite": ({"type": "sqlite"}, None),
 }

+# Only configure Cloud DWH if running full integration tests
 if os.getenv("FEAST_IS_LOCAL_TEST", "False") != "True":
     AVAILABLE_OFFLINE_STORES.extend(
         [
@@ -141,6 +142,7 @@
 }


+# Replace online stores with emulated online stores if we're running local integration tests
 if os.getenv("FEAST_LOCAL_ONLINE_CONTAINER", "False").lower() == "true":
     replacements: Dict[
         str, Tuple[Union[str, Dict[str, str]], Optional[Type[OnlineStoreCreator]]]
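
The two comments added here summarize how repo_configuration.py branches on environment variables: FEAST_IS_LOCAL_TEST skips the cloud data warehouse offline stores, and FEAST_LOCAL_ONLINE_CONTAINER swaps online store entries for container-backed creators. A hedged sketch of that replacement step follows; the dictionary shape matches the diff, but the creator class and entries are simplified stand-ins rather than the module's real values.

```python
# Illustrative sketch of the env-driven online store replacement in repo_configuration.py.
# RedisOnlineStoreCreator here is a placeholder for the real containerized creator class.
import os
from typing import Dict, Optional, Tuple, Type, Union


class OnlineStoreCreator:  # stand-in for the real base class
    pass


class RedisOnlineStoreCreator(OnlineStoreCreator):  # stand-in containerized creator
    pass


AVAILABLE_ONLINE_STORES: Dict[
    str, Tuple[Union[str, Dict[str, str]], Optional[Type[OnlineStoreCreator]]]
] = {
    "redis": ({"type": "redis", "connection_string": "localhost:6379,db=0"}, None),
    "sqlite": ({"type": "sqlite"}, None),
}

# Replace online stores with emulated online stores if we're running local integration tests
if os.getenv("FEAST_LOCAL_ONLINE_CONTAINER", "False").lower() == "true":
    replacements: Dict[
        str, Tuple[Union[str, Dict[str, str]], Optional[Type[OnlineStoreCreator]]]
    ] = {"redis": ("redis", RedisOnlineStoreCreator)}
    AVAILABLE_ONLINE_STORES.update(replacements)
```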

sdk/python/tests/integration/feature_repos/universal/online_store/datastore.py

Lines changed: 1 addition & 1 deletion

@@ -27,7 +27,7 @@ def create_online_store(self) -> Dict[str, str]:
         self.container.start()
         log_string_to_wait_for = r"\[datastore\] Dev App Server is now running"
         wait_for_logs(
-            container=self.container, predicate=log_string_to_wait_for, timeout=5
+            container=self.container, predicate=log_string_to_wait_for, timeout=10
         )
         exposed_port = self.container.get_exposed_port("8081")
         os.environ[datastore.client.DATASTORE_EMULATOR_HOST] = f"0.0.0.0:{exposed_port}"

sdk/python/tests/integration/feature_repos/universal/online_store/dynamodb.py

Lines changed: 1 addition & 1 deletion

@@ -21,7 +21,7 @@ def create_online_store(self) -> Dict[str, str]:
             "Initializing DynamoDB Local with the following configuration:"
         )
         wait_for_logs(
-            container=self.container, predicate=log_string_to_wait_for, timeout=5
+            container=self.container, predicate=log_string_to_wait_for, timeout=10
         )
         exposed_port = self.container.get_exposed_port("8000")
         return {

sdk/python/tests/integration/feature_repos/universal/online_store/hbase.py

Lines changed: 1 addition & 1 deletion

@@ -19,7 +19,7 @@ def create_online_store(self) -> Dict[str, str]:
             "Initializing Hbase Local with the following configuration:"
         )
         wait_for_logs(
-            container=self.container, predicate=log_string_to_wait_for, timeout=5
+            container=self.container, predicate=log_string_to_wait_for, timeout=10
         )
         exposed_port = self.container.get_exposed_port("9090")
         return {"type": "hbase", "host": "127.0.0.1", "port": exposed_port}

sdk/python/tests/integration/feature_repos/universal/online_store/redis.py

Lines changed: 1 addition & 1 deletion

@@ -17,7 +17,7 @@ def create_online_store(self) -> Dict[str, str]:
         self.container.start()
         log_string_to_wait_for = "Ready to accept connections"
         wait_for_logs(
-            container=self.container, predicate=log_string_to_wait_for, timeout=5
+            container=self.container, predicate=log_string_to_wait_for, timeout=10
         )
         exposed_port = self.container.get_exposed_port("6379")
         return {"type": "redis", "connection_string": f"localhost:{exposed_port},db=0"}

sdk/python/tests/integration/registration/test_registry.py

Lines changed: 11 additions & 0 deletions

@@ -571,7 +571,18 @@ def test_apply_feature_view_integration(test_registry):
 @pytest.mark.parametrize(
     "test_registry", [lazy_fixture("gcs_registry"), lazy_fixture("s3_registry")],
 )
+def test_apply_data_source_integration(test_registry: Registry):
+    run_test_data_source_apply(test_registry)
+
+
+@pytest.mark.parametrize(
+    "test_registry", [lazy_fixture("local_registry")],
+)
 def test_apply_data_source(test_registry: Registry):
+    run_test_data_source_apply(test_registry)
+
+
+def run_test_data_source_apply(test_registry: Registry):
     # Create Feature Views
     batch_source = FileSource(
         name="test_source",
