Factorial macro example in C++23 metaprogramming: https://godbolt.org/z/TWe11hM6j
Exercise for the reader: if using VC++ or clang/ninja, use import std instead.
Nicely puts 5040 into the ESI register at compile time.
Granted, C++ isn't Lisp, but it already has quite a lot of room for creativity at compile time, and C++26 might finally get compile-time reflection as well.
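For anyone who doesn't follow the godbolt link: a minimal consteval factorial sketch of my own (not the linked code) that similarly folds the result at compile time.

#include <cstdio>

// Illustrative sketch only, not the code behind the godbolt link:
// consteval forces the factorial call to be evaluated during compilation.
consteval unsigned long long factorial(unsigned n) {
    return n <= 1 ? 1ULL : n * factorial(n - 1);
}

int main() {
    // 7! = 5040 is baked into the binary as a constant;
    // no recursion survives into the generated code.
    constexpr auto f = factorial(7);
    std::printf("%llu\n", f);
}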
The terms "macro" and "metaprogramming" already have well-defined meanings in the C++ community. You're just hijacking the terms and using them with some new definition.
The C++ standard is consistent in how it uses "macro," and it's strictly about the preprocessor kind of macros. "Metaprogramming" is also used in the standard, though not as extensively or rigidly (only 8 times in N4950), but both in the specification and colloquially it's more than just slapping consteval on a function.
While I agree regarding the macro term, consteval very much counts as metaprogramming in C++.
There is no reason to encode value computations using template expansion anymore. Even a lot of type-level computation can be done via plain function calls and decltype. "Old school" recursive template instantiation is really only needed when you specifically need the ability to pattern match on types via template specialization.
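A small sketch of the contrast (my own illustration): the recursive-template version next to the plain constexpr and decltype versions.

#include <tuple>
#include <type_traits>
#include <utility>

// Old school: recursive template instantiation to compute a value.
template <unsigned N>
struct Fact { static constexpr unsigned long long value = N * Fact<N - 1>::value; };
template <>
struct Fact<0> { static constexpr unsigned long long value = 1; };

// Today: a plain constexpr function does the same job.
constexpr unsigned long long fact(unsigned n) {
    unsigned long long r = 1;
    for (unsigned i = 2; i <= n; ++i) r *= i;
    return r;
}

// Type-level computation via an ordinary function signature plus decltype;
// the function is never defined, it only exists to be "called" in decltype.
template <class... Ts, class... Us>
auto cat(std::tuple<Ts...>, std::tuple<Us...>) -> std::tuple<Ts..., Us...>;

using Joined = decltype(cat(std::declval<std::tuple<int, char>>(),
                            std::declval<std::tuple<double>>()));

static_assert(Fact<5>::value == fact(5));
static_assert(std::is_same_v<Joined, std::tuple<int, char, double>>);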
Yeah, because using preprocessor macros would do the trick. /s
macro programming in Lisp would be "programming programs".
I don't see it here. This looks like compile-time execution to compute values. If it were a macro, it could return source code.
In general, a macro produces code at compile time; it doesn't matter in which form it ends up in the binary, as long as the observable side effects are the same.
Example of a library that generates serialization code, https://github.com/getml/reflect-cpp
As mentioned, the ongoing C++26 proposal will produce the desired source code at compile time, thus reducing the amount of code in libraries such as that one.
> In general, a macro produces code at compile time
The term "macro" is overloaded. In a typical Lisp they are generating source as data, not just constant values. Lisp macros also work in interpreters and they can use arbitrary language features, incl. creating side effects or accessing external data sources.
Some people are calling this "multistage programming" [1] which is kind of related but not exactly the same as a macro system.
--
1: https://en.wikipedia.org/wiki/Multi-stage_programming
That is not a macro.
I doubt a #define would suffice.
one fundamental difference being that clisp (i keep writing this as 'clips', talk about lisp !) does not suffer from precision issues, f.e. writing this
402387260077093773543702433923003985719374864210714632543799910429938512398629020592044208486969404800479988610197196058631666872994808558901323829669944590997424504087073759918823627727188732519779505950995276120874975462497043601418278094646496291056393887437886487337119181045825783647849977012476632889835955735432513185323958463075557409114262417474349347553428646576611667797396668820291207379143853719588249808126867838374559731746136085379534524221586593201928090878297308431392844403281231558611036976801357304216168747609675871348312025478589320767169132448426236131412508780208000261683151027341827977704784635868170164365024153691398281264810213092761244896359928705114964975419909342221566832572080821333186116811553615836546984046708975602900950537616475847728421889679646244945160765353408198901385442487984959953319101723355556602139450399736280750137837615307127761926849034352625200015888535147331611702103968175921510907788019393178114194545257223865541461062892187960223838971476088506276862967146674697562911234082439208160153780889893964518263243671616762179168909779911903754031274622289988005195444414282012187361745992642956581746628302955570299024324153181617210465832036786906117260158783520751516284225540265170483304226143974286933061690897968482590125458327168226458066526769958652682272807075781391858178889652208164348344825993266043367660176999612831860788386150279465955131156552036093988180612138558600301435694527224206344631797460594682573103790084024432438465657245014402821885252470935190620929023136493273497565513958720559654228749774011413346962715422845862377387538230483865688976461927383814900140767310446640259899490222221765904339901886018566526485061799702356193897017860040811889729918311021171229845901641921068884387121855646124960798722908519296819372388642614839657382291123125024186649353143970137428531926649875337218940694281434118520158014123344828015051399694290153483077644569099073152433278288269864602789864321139083506217095002597389863554277196742822248757586765752344220207573630569498825087968928162753848863396909959826280956121450994871701244516461260379029309120889086942028510640182154399457156805941872748998094254742173582401063677404595741785160829230135358081840096996372524230560855903700624271243416909004153690105933983835777939410970027753472000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000 works as expected.for the c++ version i don't think even `unsigned long long` can accommodate results from larger than approx. 21! or thereabouts.
That is when you would take advantage of the C++ type system and use a class implementing multi-precision numbers with operator overloading.
Take advantage of, or be forced to use :)
CLISP is an unfortunately named specific implementation of Common Lisp, like SBCL, ECL, Clasp, CCL, ABCL, etc.
looks like he is using GNU CLISP.
It's a common misconception of non-Lispers that macros are equivalent to compile-time programming. You're not simply moving the evaluation to compile time; you're moving it upwards, outside the program space, into a higher dimension of programmability.
Not to dog on C++ unfairly; compile-time evaluation (CTE) is pretty neat, after all.
Funnily enough, PG's "On Lisp" has some really neat macros in it that demonstrate capabilities that just can't be replicated with template-based macros, iirc.
"outside the program space into a higher dimension of programmability"
I can visualize this metaphor just fine, but I can't tell why it's useful. Can you make this concept more concrete?
It's "meta" programming, since Lisp macros do source transformations: code to new code. So one writes code which will rewrite code. For example, say some programming language lacks a control structure one would want. Instead of waiting for the benevolent dictator for life to implement this feature, one can do it by creating a macro which implements the control structure.
For example, imagine that you want to program with state machines and you need a short notation for that in your programming language. In Lisp you could design a syntax for state machines, and a Lisp macro would transform state machine descriptions into the code used to implement them. The generated code will typically be longer and full of implementation details; in the state machine description one would only specify what's necessary. The Lisp macro does the code transformation from the use of the new control structure to the implementation of that control structure.
Thus one can view Lisp as a programmable programming language.
Templates are not nearly as powerful as Lisp macros, and stitching together code is 100 times more complex (bordering on Turing-tarpit territory), but almost arbitrary compile-time code generation is still possible. See boost.lambda, boost.spirit, Eigen, or anything using expression templates, for example.
Aside from the complexity, the main thing C++ currently lacks is code introspection, to modify existing code without using an ad-hoc DSL. I think that even the reflection proposals do not go as far as actually introspecting code.
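To make the expression-template idea concrete, here is a deliberately tiny sketch of my own (nothing like the real machinery in Eigen or Boost): operator+ builds a lightweight expression object, and the elementwise loop is only generated when the expression is assigned to a vector.

#include <array>
#include <cstddef>

// Toy expression template: Sum captures the operands and evaluates lazily.
template <class L, class R>
struct Sum {
    const L& l;
    const R& r;
    double operator[](std::size_t i) const { return l[i] + r[i]; }
};

template <class L, class R>
Sum<L, R> operator+(const L& l, const R& r) { return {l, r}; }

struct Vec {
    std::array<double, 3> data{};
    double  operator[](std::size_t i) const { return data[i]; }
    double& operator[](std::size_t i)       { return data[i]; }

    template <class Expr>
    Vec& operator=(const Expr& e) {            // one fused loop, no temporaries
        for (std::size_t i = 0; i < 3; ++i) data[i] = e[i];
        return *this;
    }
};

int main() {
    Vec a{{1, 2, 3}}, b{{4, 5, 6}}, c{{7, 8, 9}}, out;
    out = a + b + c;                           // expands to a single elementwise loop
}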
This is probably not the whole picture, and I have a very Rust-centric view of this, but I'll take a stab at it.
The correct analogue for Lisp macros is not C++ templates, but the C preprocessor itself. Specifically, a Lisp macro gets to take a particular section of code and change it as it wishes, with everything already tokenized for the programmer's convenience. Imagine if you could write your own C preprocessor as part of your program and have the compiler automatically execute it on the specific program areas that want your preprocessing.
Rust macros work similarly to this, the main difference being your syntax needs to be tokenizable as Rust instead of Lisp. But they're also rather powerful. So, for example, in Rust you only have one object system which has structs, traits, and very limited higher-kindedness[0]. But there's plenty of other object systems Rust would like to interop with: Objective-C, Swift, COM, and C++ among others.
The canonical way of doing this in Rust is to write a macro[1] that takes your class definition and converts it into a series of structs, traits, and/or function pointers that suitably interop with the foreign code. Code outside the macro then can reference the class created by the system.
If you don't have macros, your other options are:
- Metaclasses, which are the canonical way in Python of doing foreign object interfaces, though with an added wrinkle: multiple inheritance from classes of different metaclasses requires writing a combined metaclass that does both. In macros you usually just can't mix them like that, though I doubt you'd need to define a single class accessible from, say, both Objective-C and Windows COM.
- Write your own damned preprocessor. This is what Qt did with MOC (metaobject compiler) to get signals and slots[2]. If C++ had macros, Trolltech probably would have written MOC as a macro instead of a separate build step with a separate C++ tokenizer.
[0] A concept which I will not be explaining in this post, but it has to do with things like generic associated types which were needed for lifetime bounds on async traits
[1] Usually a "procedural macro", which is different from the pattern-matching macros Rust usually teaches in ways that don't matter here
[2] https://en.wikipedia.org/wiki/Signals_and_slots
Let's say you're writing a web application. In most programming languages you'd be using libraries or relying on a framework. With Lisp macros, you can program the archetype of a web application, and then use a simpler language to describe your application. Think of it as programmable snippets. Something like Ultisnips [0], but inherent to the language.
Think how much common code can exist in a piece of software but cannot be refactored into functions because it will have too many variables. Or the multiple problems with class trees and overloading. Macros let you solve that.
[0]: https://github.com/SirVer/ultisnips
>Think how much common code can exist in a piece of software but cannot be refactored into functions because it will have too many variables.
Interesting...can you give an example of a macro that solves this refactoring problem?
One example is the ~use-package~ macro (Emacs plugin) [0]. Using packages in Emacs is mostly the same code over and over. It has already been abstracted into functions, but you still find yourself juggling so many utilities. You could write a bigger function, but it would then have a lot of conditional branches. This macro selectively selects the code it needs, transforms it if need be, and then the result is evaluated.
It's a bit hard for me to explain (English is not my native language). But it's the difference between coding a solution with all the edge cases baked in and coding an archetype that lets you add your own cases. With functions, you abstract common algorithms; with macros, you abstract common architecture.
[0] https://github.com/jwiegley/use-package/blob/a6e856418d2ebd0...
> cannot be refactored into functions because it will have too many variables.
I'm skeptical of this. Sounds ugly.
Meaning create a domain specific language for that web app.
I think an example would be helpful, conditionals:
Within the "program dimension" there is just no way to run code conditionally without an if, no matter how much you move left and right, you are constrained. It is only possible by using the "higher dimension".
"move left and right"? This meant nothing to me. 'higher dimension' Still pretty abstract.
This is different from constexpr if in C++? Where one branch will not exist in the compiled code?
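For reference, here is roughly what `if constexpr` branch discarding looks like (a minimal sketch of my own); whether that counts as the same kind of power as a macro-defined conditional is exactly what the parent comments are debating.

#include <cstdio>
#include <string>
#include <type_traits>

// The branch not taken for a given instantiation is discarded,
// so it never has to compile for that type.
template <class T>
std::string describe(const T& value) {
    if constexpr (std::is_arithmetic_v<T>) {
        return std::to_string(value);   // only instantiated for numbers
    } else {
        return std::string(value);      // only instantiated for string-like arguments
    }
}

int main() {
    auto a = describe(42);              // uses the to_string branch
    auto b = describe("hello");         // uses the string branch
    std::printf("%s %s\n", a.c_str(), b.c_str());
}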
A Lisp macro takes zero [0] or more unevaluated forms, does something with them (which may be computing a value; see the factorial example elsewhere in this discussion and the submitted blog), and then returns a new form (which could be that computed value; again, see the factorial example) which the Lisp system eventually evaluates.
One way to see this would be to do something like:
https://tio.run/##S87JLC74/18jJTUtNzG5KF@hoCgzr0S3LDGnNFVBI1...

The value bound to a inside the macro is the unevaluated form (the first and third items printed out). In this case the macro itself is just the identity macro other than its print effect, so it returns the original form, which is why we get 1 and 3 printed as well.
Here's a (very primitive) example of processing the unevaluated form to produce something new:

(defmacro infix (a op b)
  `(,op ,a ,b)) ;; alternatively: (list op a b)

(print (infix 10 + 20))
;; output
30

Now we can get fancy (this is primitive still, but works): https://tio.run/##fZDBDoMgEETvfsXculh7qP2hImJKSpUgSf17uiqJ1V...

This generates a valid (prefix notation) Lisp form from a more traditional infix form from mathematics. If I were being more clever I wouldn't use the second item in the list (the operation) directly, but would restrict it to valid arithmetic operators and change how I walk the structure. That would remove the need to force explicit parentheses, since I could add in a proper parsing step. This is a very primitive version of what loop and other complex macros do: they take what is essentially a different language, parse it, and emit a valid Lisp form which is then evaluated.

You could use this to get constexpr-like behavior, but once you do that you run into problems; you can't do this, for example:

(defmacro foo (a b) (+ a b))

(foo (+ 1 2) (+ 3 4)) ;; error
(let ((a 1) (b 2)) (foo a b)) ;; error
(foo 1 2) ;; => 3

It only partially works because it only works when a and b are both numbers. If they're symbols (second case) or other forms (first case), then the macro attempts to compute something that cannot be computed. You can fix the first case by doing:

(defmacro foo (a b) (+ (eval a) (eval b)))

But that still leaves the second case erroring out. You could do something like what I did with infix: walk the forms, determine whether they can be evaluated (no unbound variables), and then evaluate them conditionally, leaving expressions with unbound symbols intact to be processed later.

So C++ constexprs are less than Lisp macros, but if you want to use Lisp macros to do the same thing as constexprs you have to do more work. Check out On Lisp and Let Over Lambda for two books that go deep into macros in Common Lisp.
[0] I honestly don't know why you'd want to do this, but technically it's valid to do:
(defmacro foo () (some form to return))
(macroexpand '(foo))
;; => (some form to return)

I cannot think of a case where this would be useful, but someone else might think of one.

You can give the language new features, using the features that the language provides.
(defmacro factorial (n)
  (labels ((fact (m)
             (if (= m 0)
                 1
                 (* m (fact (1- m))))))
    `,(fact n)))

The `, has no use here and can be removed: the backquote followed by the unquote just returns the computed value. Thus, this is okay:

(defmacro factorial (n)
  (labels ((fact (m)
             (if (= m 0)
                 1
                 (* m (fact (1- m))))))
    (fact n)))

LABELS defines local recursive functions. The macro returns the result of calling FACT, which is a number and which is a valid form in Common Lisp. A number evaluates to itself.

CL-USER > (macroexpand-1 '(factorial 10))
3628800
T

Agreed with the technical content and conclusion. However, I think it is worth pointing out that since C++11, it has had a mechanism to specify (maybe) compile-time computations that are written in plain C++: constexpr, https://en.cppreference.com/w/cpp/language/constexpr
(My parenthetical "maybe" is that I don't think compilers have to compute constexpr expressions at compile time. The compiler will be forced to when such expressions are used in contexts that require values at compile time. But I think it would be permissible for a compiler to defer computation of a constexpr to runtime if the value isn't needed until runtime.)
Note that C++20 introduced consteval as a means to enforce compile time computation. See https://en.cppreference.com/w/cpp/language/consteval
Besides consteval and constinit, you can force evaluation with static constexpr.
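A small sketch of the distinctions (my own illustration, not from the linked pages):

#include <cstdio>

constexpr int square(int x) { return x * x; }      // may be evaluated at compile time
consteval int cube(int x)   { return x * x * x; }  // must be evaluated at compile time

int main(int argc, char**) {
    int a = square(argc);               // runtime argument: evaluated at run time
    constexpr int b = square(6);        // constant-expression context: forced to compile time
    static constexpr int c = square(7); // static constexpr initializer: also compile time
    int d = cube(3);                    // consteval: always compile time
    // int e = cube(argc);              // error: consteval needs constant arguments
    std::printf("%d %d %d %d\n", a, b, c, d);
}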
Another post comparing C and lisp Macros: http://lists.warhead.org.uk/pipermail/iwe/2005-July/000130.h...
HN discussion: https://news.ycombinator.com/item?id=31199992
Lisp macros can take arbitrary parameters and are written in Lisp.
C++ templates can only take types and numbers (until variadic templates), and writing any code to operate on those inputs is challenging.
https://github.com/knome/metabrainfuck/blob/master/bf.cpp
It's not too bad :)
As a code golfer, this is beyond satisfying. Doing things in a few bytes is nice, yes, but doing them in a way no reasonable mortal ever would is even better.
There is the Lisp meme: "Mom, can we have defmacro?" "No, we have defmacro at home."
The post is actually about template metaprogramming. 'template macroprogramming' isn't really a thing.
Yes, the purpose of C++ templates (not macros; macros are a C preprocessor feature) is to implement generic classes and functions, not to implement arbitrary computation, and (iirc) it was only discovered after their publication that they are in fact Turing-complete.
C++ templates can also take other C++ templates (the template itself, not just an instantiation) and enum values, and I think in C++23 maybe even structs, but I need to check.
C++20 added the ability to take class instances as non-type parameters but the rules around the types of classes that can be used in this way are pretty restrictive (for good reason).
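A minimal sketch of what that looks like (my own illustration): a simple structural class used as a C++20 non-type template parameter.

#include <cstdio>

// C++20: a "structural" class type (public members of structural type)
// can be used as a non-type template parameter.
struct Point {
    int x;
    int y;
};

template <Point P>               // a class instance as a template argument
constexpr int manhattan() {
    return (P.x < 0 ? -P.x : P.x) + (P.y < 0 ? -P.y : P.y);
}

int main() {
    constexpr int d = manhattan<Point{3, -4}>();
    static_assert(d == 7);
    std::printf("%d\n", d);
}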
It’s clumsy because C++ templates are an implementation of generic and dependent types, while C++ constexpr and consteval functions are for arbitrary computation. The fact that template metaprogramming _can_ be used for arbitrary computation is actually an unhappy accident of its design that was only proven after its publication.
Does this article have a (2002) that I missed or something?
I hate to agree. For an article about metaprogramming in C++, you'd expect expert level C++ code, and not what's presented here. It's bad. And the first example with the list even has a bug.
Macaroni art versus the Mona Lisa.
The "next" field in the lead code sample is supposed to be a pointer, right?
I don’t have any experience with Lisp. But I think C++ templates and Rust macros are both super bad and impoverished compared to what can be done in Jai.
https://www.forrestthewoods.com/blog/using-jais-unique-and-p...
Except Jai is never going to get the use any of them have, so we use what is there.
That is not a coherent sentence.
I share it not to say “use Jai instead of C++/Rust”. But to instead say “templates and macros suck and it’d be great if Rust or other language copied good ideas into their ecosystem”.
First we need to be able to use Jai, so that folks can validate in practice how it works, rather than watching videos of someone using their own creation.
Someone has to lead by example. Jai is trying things no one else is doing. No reason that Rust or other more widely available languages can't lead.
Rust certainly led the charge with the borrow checker. Why not do better than macros and templates and generics?
The `#modify` thing looks pretty cool, but I can't help but think how a compiler/IDE is supposed to analyze the body of such a function and provide suggestions. Rust macros have a similar issue, but it's offset by the fact that generics are pretty powerful too and can be fully analyzed by the compiler.
Pffft, I’d just #. that factorial call and it’d be a normal defun.