Description
Is your feature request related to a problem? Please describe.
Currently, to create a DSL with Spark, you have to manually manipulate Entity and Section structs.
Describe the solution you'd like
I propose a builder API that allows all of the information a DSL needs to be specified procedurally, leaning heavily on the pipe operator.
Describe alternatives you've considered
A meta DSL would be another way to do this.
As @jimsynz and I discussed in #47, a new API gives us the opportunity to improve the usability and understandability of Spark, including radical changes that don't need to be backwards compatible, since the API is new.
A builder API will be easier to design incrementally, and can serve as the backbone of what we do with the "Code as Data" from the meta-DSL.
Express the feature either with a change to resource syntax, or with a change to the resource interface
For example:

```elixir
entity =
  Spark.Builder.Entity.new(:my_entity, MyEntity)
  |> Spark.Builder.Entity.field(Spark.Builder.Field.new(...))
  |> Spark.Builder.Entity.field(...)
  |> Spark.Builder.Entity.subentity(...)
  ...
```
The builder API would include modules and functions for all of the data needed for a complete DSL.
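To illustrate how the pieces might compose, here is a hedged sketch of assembling a section from builder-constructed entities and wiring it into an extension. None of these modules or functions exist yet; `Spark.Builder.Section`, `Spark.Builder.Entity`, and the `entity/2`, `schema/2`, and `build/1` functions are hypothetical names invented for this example, chosen only to mirror the shape of the proposal above.

```elixir
# Hypothetical builder API — every module and function below is a
# sketch of what the proposed API *could* look like, not real Spark code.
field_entity =
  Spark.Builder.Entity.new(:field, MyLibrary.Field)
  # assumed option-style schema declaration, mirroring Spark's
  # existing NimbleOptions-based entity schemas
  |> Spark.Builder.Entity.schema(name: [type: :atom, required: true])

fields_section =
  Spark.Builder.Section.new(:fields)
  |> Spark.Builder.Section.entity(field_entity)
  |> Spark.Builder.Section.describe("Configure the fields of the resource.")

# A final build/1 step could validate the accumulated data and
# return the plain %Spark.Dsl.Section{} struct used today.
section = Spark.Builder.Section.build(fields_section)
```

A `build/1` step at the end would let the builder validate everything at once and emit the same structs the current manual approach produces, so the two styles could coexist during a transition.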