Proper support for remove_depends=False #490
Conversation
src/dishka/integrations/base.py (outdated diff):

```python
    # Inject the dependency if it was removed from the signature
    yield name, dep
elif param.kind is Parameter.POSITIONAL_OR_KEYWORD:
    # Inject the dependency if not provided explicitly
```
I do not think it is a good idea to dynamically decide whether to inject a dep or not. Each function should have a single way of being called: either the user passes the params or dishka does. Any conflicts should be resolved by the user.
There are two reasons for this proposal:

- Flexibility & low coupling.
  Say I decorate a function with a custom inject decorator that preserves the original signature:

  ```python
  @inject
  def func(i: int, db: FromDishka[Database]) -> None:
      ...
  ```

  Then this can be called
  - either as func(1), where db is injected using the dishka machinery,
  - or as func(1, db), as a regular function unaware of dishka.

  Without this option, either all callers must be dishka-aware (which leads to tight coupling) or the function would need to be duplicated with a different signature (which leads to duplication).
  Another way I think about it is that in this case FromDishka[Database] acts like the default_factory of dataclass/Pydantic fields, but more powerful:
  - it works with plain functions, with no need to bundle them under ad-hoc classes
  - the factory definition is much more versatile

  But just like default_factory is not invoked when a dataclass/Pydantic field is assigned explicitly, the same should be possible with a dishka dependency (at least as an option enabled by the user; it's fine to disable it by default). A standard-library sketch of this analogy follows this comment.
- Type checking.
  The fact that func(1) works and func(1, db) doesn't is confusing for humans and type checkers alike: mypy reports error: Missing positional argument "db" in call to "func" for the former and no error for the latter.

If disallowing passing dependencies explicitly is a deliberate decision, I guess I'm missing the point of remove_depends=False.
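To make the default_factory analogy concrete, here is a minimal standard-library sketch (Service and make_db are invented for illustration and are not part of this PR): the factory runs only when the field is not assigned explicitly, which is the behaviour proposed for dishka dependencies under remove_depends=False.

```python
from dataclasses import dataclass, field


def make_db() -> dict:
    # Hypothetical stand-in for a Database factory.
    return {"engine": "default"}


@dataclass
class Service:
    i: int
    db: dict = field(default_factory=make_db)


print(Service(1).db)                        # factory used -> {'engine': 'default'}
print(Service(1, {"engine": "custom"}).db)  # explicit value -> factory skipped
```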
The idea behind remove_depends was just signature modification. It was not intended to allow using a function in multiple ways. Probably, it could be used for meta-information.

Ok, I got your point, and we can consider it if it doesn't affect the performance of the remove_depends=True case.
src/dishka/integrations/base.py (outdated diff):

```python
if not has_param(*args, **kwargs):
    yield name, dep


@staticmethod
```
Is there any reason to guarantee that an "internal" method should be callable without a class instance?
Please do not use staticmethods unless you need to guarantee this.
I used staticmethods just as a namespace mechanism; changed them to global functions in the latest commit.
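For illustration only (the _Helpers class and this has_param signature are hypothetical, not the PR's actual code), the change amounts to moving a pure helper out of a class that served only as a namespace:

```python
# Before: a staticmethod used purely as a namespace for a helper.
class _Helpers:
    @staticmethod
    def has_param(name: str, kwargs: dict) -> bool:
        return name in kwargs


# After: a plain module-level function, as requested in the review.
def has_param(name: str, kwargs: dict) -> bool:
    return name in kwargs
```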
I want to clarify that I am not planning to merge this if it has a negative effect on performance with remove_depends=True.

To be more direct: any additional function call within the nested wrapper functions is not welcome.

We had plans to improve performance, not to slow it down. Please take it into account. I do not feel this functionality is critical enough to break this rule about performance. Still, we can discuss how it could be implemented in another way if you have ideas.
Can you please clarify what you mean by a negative effect on performance? Anecdotally (and completely unscientifically), a function call incurs less than 20 nanoseconds on my system (macOS, Python 3.12):

```
In [1]: dependencies = {"d1": "d1", "d2": "d2"}

In [2]: def f1(dependencies):
   ...:     return {name: dep for name, dep in dependencies.items()}
   ...:

In [3]: def f2(dependencies):
   ...:     return f1(dependencies)
   ...:

In [4]: timeit f1(dependencies)
139 ns ± 1.28 ns per loop (mean ± std. dev. of 7 runs, 10,000,000 loops each)

In [5]: timeit f2(dependencies)
152 ns ± 0.346 ns per loop (mean ± std. dev. of 7 runs, 10,000,000 loops each)

In [6]: timeit f1(dependencies)
138 ns ± 0.498 ns per loop (mean ± std. dev. of 7 runs, 10,000,000 loops each)

In [7]: timeit f2(dependencies)
154 ns ± 0.409 ns per loop (mean ± std. dev. of 7 runs, 10,000,000 loops each)
```

I'm not sure if/why such submicrosecond differences would be relevant for a pure-Python DI library (typically used by web frameworks and queue systems that involve network I/O orders of magnitude slower than a function call), but I'm happy to remove the extra calls for the remove_depends=True path.
I know that this timing does not look very serious, but it is noticeable compared to the whole injection logic.

So, the overall idea is not to slow down the code unless we find something really good to have. Among our ideas there was code generation for the injection wrappers.
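As a purely illustrative sketch of the code-generation idea mentioned above (compile_wrapper and resolve are invented names, and this is not dishka's actual plan or API): the wrapper body is generated as source, so resolving dependencies adds no extra nested Python calls in the hot path.

```python
def compile_wrapper(func, dep_names):
    # Emit one inline assignment per dependency instead of a loop or a
    # helper call executed on every request.
    lines = ["def wrapper(resolve, *args, **kwargs):"]
    for name in dep_names:
        lines.append(f"    kwargs[{name!r}] = resolve({name!r})")
    lines.append("    return func(*args, **kwargs)")
    namespace = {"func": func}
    exec("\n".join(lines), namespace)  # illustration only
    return namespace["wrapper"]


def handler(request, db):
    return request, db


wrapper = compile_wrapper(handler, ["db"])
print(wrapper({"db": "a-database"}.get, "a-request"))  # ('a-request', 'a-database')
```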
I updated it so that the only overhead for remove_depends=True is the check shown in the diff below.
```python
)
container = container_getter(args, kwargs)
async with container(additional_context) as container:
    if isinstance(dependencies, ParameterDependencyResolver):
```
Let's have two separate function arguments: one used to bind, the other to iterate over deps. Then we can simplify the check to if binder is not None.
Just to clarify, you mean replacing the dependencies: dict[str, DependencyKey] | ParameterDependencyResolver parameter with two parameters, dependencies: dict[str, DependencyKey] and resolver: ParameterDependencyResolver, in all the _get_auto_injected_* functions, and then using only one of the two?

```python
if resolver is None:
    ...  # iterate over dependencies
else:
    ...  # iterate over resolver; dependencies are ignored
```
Yes, I mean this. It might be less clean in terms of encapsulation, but it is easier to check and to rewrite with code generation.
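Here is a minimal sketch of the two-argument shape being agreed on (the iter_injected function and the Resolver protocol with its missing() method are assumptions for illustration; only ParameterDependencyResolver comes from this PR): dependencies drives the remove_depends=True path with no extra calls, while the optional resolver handles remove_depends=False.

```python
from __future__ import annotations

from typing import Any, Iterator, Protocol


class Resolver(Protocol):
    # Hypothetical stand-in for the PR's ParameterDependencyResolver.
    def missing(self, args: tuple, kwargs: dict) -> Iterator[tuple[str, Any]]: ...


def iter_injected(
    dependencies: dict[str, Any],
    resolver: Resolver | None,
    args: tuple,
    kwargs: dict,
) -> Iterator[tuple[str, Any]]:
    if resolver is None:
        # remove_depends=True: iterate the precomputed dict directly,
        # with no extra function calls in the hot path.
        yield from dependencies.items()
    else:
        # remove_depends=False: let the resolver skip dependencies the
        # caller already passed explicitly.
        yield from resolver.missing(args, kwargs)
```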
wrap_injection accepts a remove_depends parameter; currently, all it does when True is change the annotations and the signature of the decorated function, and it has no impact at runtime. Both the default value and all built-in integrations use remove_depends=True; there's no example (even in tests) with False.

This PR extends (or fixes, depending on where you come from) wrap_injection to allow the caller of the decorated function to pass dependencies explicitly when remove_depends=False, instead of unconditionally requesting them from the container and injecting them. Explicitly passed dependencies can be passed either by name or positionally if the dependency parameter kind is POSITIONAL_OR_KEYWORD (POSITIONAL_ONLY dependencies are not currently supported; they could be added separately if needed).

Also added thorough tests and fixed a few typing issues in integrations/base.py.
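As a stdlib-only, hedged illustration of the semantics described above (toy_inject is invented for this example and is not dishka's wrap_injection): a dependency is resolved only when the caller did not already supply it, by name or positionally.

```python
import inspect
from functools import wraps


def toy_inject(**providers):
    """Map parameter names to zero-argument factories (hypothetical helper)."""
    def decorator(func):
        sig = inspect.signature(func)

        @wraps(func)
        def wrapper(*args, **kwargs):
            bound = sig.bind_partial(*args, **kwargs)
            for name, factory in providers.items():
                if name not in bound.arguments:  # caller did not pass it
                    kwargs[name] = factory()     # resolve ("inject") it
            return func(*args, **kwargs)

        return wrapper
    return decorator


@toy_inject(db=lambda: "default-db")
def func(i: int, db: str) -> str:
    return f"{i}:{db}"


print(func(1))              # "1:default-db" -- injected
print(func(1, "custom"))    # "1:custom"     -- passed positionally, no injection
print(func(1, db="other"))  # "1:other"      -- passed by name, no injection
```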
wrap_injectionto allow the caller of the decorated function pass explicitly dependencies whenremove_depends=Falseinstead of unconditionally requesting them from the container and injecting them. Explicitly passed dependencies can be passed either by name or positionally if the dependency parameter kind isPOSITIONAL_OR_KEYWORD(POSITIONAL_ONLYdependencies are not currently supported, they could be added separately if needed).Also added thorough tests and fixed a few typing issues in integrations/base.py.