Under PEP 563 (available since Python 3.7 with "from __future__ import annotations", and once slated to become the default in a later 3.x release), annotations are parsed but their evaluation is deferred. So you can literally have "var: 0 < var < 3" (the annotation is not a string!) in your code and make it work however you wish.
If your macro isn't changing execution semantics, it probably doesn't need to be a macro.
This is why "if" needs to be a special form if your language is eager but you want short-circuit conditionals. If your language is lazy, lots of things DON'T need to be macros.
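A quick sketch of the point in Python (names here are illustrative): a plain eager function evaluates all of its arguments before the call, so it can't short-circuit; you have to wrap the branches in thunks by hand to get back what the built-in special form gives you for free.

```python
def eager_if(cond, then_val, else_val):
    # An ordinary function: both branch arguments were already
    # evaluated by the time we get here, so no short-circuiting.
    return then_val if cond else else_val

def lazy_if(cond, then_thunk, else_thunk):
    # Wrapping the branches in lambdas delays evaluation until
    # the chosen branch is actually called.
    return then_thunk() if cond else else_thunk()

x = 0
# eager_if(x != 0, 1 / x, 0)  # ZeroDivisionError: both branches evaluated
print(lazy_if(x != 0, lambda: 1 / x, lambda: 0))  # prints 0
```

In a lazy language the `eager_if` version already behaves like `lazy_if`, which is why fewer things there need to be macros.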
I came to say this as well. The flexibility of Python’s data model generally means that in a lot of cases where you’d want a macro in another language or costly reflection, you can just dynamically create a function or class (with type()), dynamically change things to be properties (by calling the property decorator as a function), mess with how isinstance and issubclass work if you want interface enforcement, etc.
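As a small illustration of that flexibility (the class and field names are made up for the example): you can build a class at runtime with `type()` and attach read-only accessors by calling the `property()` built-in as an ordinary function, no macro required.

```python
def make_point_class(fields):
    # Dynamically build a class whose fields are read-only properties.
    def __init__(self, **kwargs):
        for name in fields:
            setattr(self, "_" + name, kwargs.get(name, 0))

    namespace = {"__init__": __init__}
    for name in fields:
        # Default-argument trick captures `name` per loop iteration.
        namespace[name] = property(lambda self, n=name: getattr(self, "_" + n))

    # type(name, bases, namespace) creates the class object directly.
    return type("Point", (), namespace)

Point = make_point_class(["x", "y"])
p = Point(x=3, y=4)
print(p.x, p.y)  # prints "3 4"
```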
I do agree macros could reduce boilerplate, but they add the extra complexity that someone has to understand AST transformations instead of just regular old Python-style dynamic code.
Stepping back, though: whether with macros or with metaprogramming in the dynamic data model, the biggest rule of thumb is: don't do this stuff. You aren't going to need it. Don't solve your problem (like arg parsing in the Rust example) with code gen.
Just write a bit of unmysterious extra boilerplate that your colleagues of mixed skill levels will thank you for. The verbose version that a wide audience can quickly grok, even though it's uglier and has more boilerplate, is great software built on mature design decisions. It will live a lot longer and be far easier to incrementally modify than the esoteric code-gen approach.
> I do agree macros could reduce boilerplate, but they add the extra complexity that someone has to understand AST transformations instead of just regular old Python-style dynamic code.
Not only do you have to understand AST transformations, but you also have mysterious stack traces pointing to non-existent code.
The amount of tooling needed to make it debuggable and understandable is significant; the JavaScript folks have done that lift, but that's a large community that was heavily dependent on it.
All that said, I think AST transforms have their uses. I wrote a library[1] for pyrsistent to allow normal mutable usage of pyrsistent's immutable data structures. But I think that works because the interface it presents and the behavior it's modeling is ordinary python.
Unchecked type "annotation" was one of the worst features ever added to a reasonably popular programming language. Python did OK without type declarations. Optional type declarations could help with checking, documentation, and optimization. What went in is the worst of both worlds. Type annotations don't do much, and they might be wrong.
Now this. Ugh. This is abusing an internal feature of one implementation.
It gets a good bit more complicated than this, but when you've got a whole mess of functions to chain together, this lets you do it _reasonably_ sanely and figure out the ordering.
I think it should be possible to stay compatible with a type checker like mypy, since the decorator could re-annotate the decorated function with the types looked up from the namespace, but I haven't tested that.
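One way to sketch the "figure out the ordering" part (this is an assumed toy, not the commenter's actual library, and it keys off parameter names rather than type annotations): resolve each function's inputs from a registry of already-computed values and run whatever becomes runnable next.

```python
import inspect

def run_chain(funcs, seed):
    # Run each function once its inputs (matched by parameter name)
    # are available, storing each result under the function's name.
    values = dict(seed)
    pending = list(funcs)
    while pending:
        progressed = False
        for func in list(pending):
            params = inspect.signature(func).parameters
            if all(name in values for name in params):
                values[func.__name__] = func(**{n: values[n] for n in params})
                pending.remove(func)
                progressed = True
        if not progressed:
            raise RuntimeError("unsatisfiable dependencies in chain")
    return values

def double(x):
    return x * 2

def plus_one(double):  # depends on double()'s result by name
    return double + 1

result = run_chain([plus_one, double], {"x": 5})
print(result["plus_one"])  # prints 11
```

The declaration order of the functions doesn't matter; the wiring falls out of the signatures.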
I did something similar to the ending of this post, where the realization sets in that annotations are just expressions. I wanted to enforce the method's annotated parameter and return types on entering and exiting the method.
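A minimal sketch of that idea (not the commenter's actual code; it assumes the annotations are plain classes, not PEP 563 strings): a decorator that reads the signature with `inspect` and checks every annotated argument on the way in and the result on the way out.

```python
import functools
import inspect

def enforce(func):
    # Check annotated parameter types at call time and the
    # annotated return type before the result leaves the function.
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            ann = sig.parameters[name].annotation
            if ann is not inspect.Parameter.empty and not isinstance(value, ann):
                raise TypeError(
                    f"{name} must be {ann.__name__}, got {type(value).__name__}"
                )
        result = func(*args, **kwargs)
        ret = sig.return_annotation
        if ret is not inspect.Signature.empty and not isinstance(result, ret):
            raise TypeError(f"return must be {ret.__name__}")
        return result

    return wrapper

@enforce
def add(a: int, b: int) -> int:
    return a + b

print(add(1, 2))   # prints 3
# add(1, "2")      # raises TypeError before the body runs
```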