Generating swaggers at compile time

At work, we've been generating code from swaggers and publishing said generated code. Alas, this requires us to remember to generate the swagger(s), as well as to keep library versions in sync (you need to upgrade versions in the server, publish the updated generated code, then update your client). See this example using the sbt-openapi-schema plugin:


import com.github.eikek.sbt.openapi._

val CirceVersion = "0.14.1"
libraryDependencies ++= Seq(
  "io.circe" %% "circe-generic" % CirceVersion,
  "io.circe" %% "circe-parser" % CirceVersion
)

openapiSpec := (Compile / resourceDirectory).value / "test.yml"
openapiTargetLanguage := Language.Scala
Compile / openapiScalaConfig := ScalaConfig()
  .withJson(ScalaJson.circeSemiauto)
  .addMapping(CustomMapping.forType({ case TypeDef("LocalDateTime", _) =>
    TypeDef("Timestamp", Imports("com.mypackage.Timestamp"))
  }))
  .addMapping(CustomMapping.forName({ case s => s + "Dto" }))

enablePlugins(OpenApiSchema)
The problem


When defining endpoints, it's not uncommon to couple the endpoint and its implementation with a .serverLogic, meaning that creating our endpoints relied on the implementation being available (we needed to instantiate the API, repositories connecting to databases, HTTP clients, and so on).
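
To make the coupling concrete, here is a minimal illustrative sketch (not the article's code; MyApiService, listAllBooks and Book are assumed names). The endpoint value cannot even be constructed without a live service instance:


import cats.effect.IO
import io.circe.generic.auto._
import sttp.tapir._
import sttp.tapir.generic.auto._
import sttp.tapir.json.circe._
import sttp.tapir.server.ServerEndpoint

case class Book(title: String)

trait MyApiService {
  def listAllBooks(): IO[Either[String, List[Book]]]
}

class CoupledEndpoints(api: MyApiService) {
  // .serverLogic ties the description to its implementation,
  // so CoupledEndpoints needs a MyApiService just to describe the API.
  val booksListing: ServerEndpoint[Any, IO] =
    endpoint.get
      .in("books")
      .errorOut(stringBody)
      .out(jsonBody[List[Book]])
      .serverLogic(_ => api.listAllBooks())
}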

This made it harder than necessary to write our swaggers to disk: the whole application had to be instantiated just for the swagger to be generated and written to disk. Otherwise, the generated code wouldn't be up to date, which is a problem. It was a long, inconvenient procedure that we only ran once in a while.

Simple solution


We can generate swaggers using libraries like tapir:


val swagger = OpenAPIDocsInterpreter()
  .toOpenAPI(
    tapirEndpoints,
    "my app",
    "v0.0.1"
  )
  .openapi("3.0.3")
  .toYaml

writeSwaggerToFile(swagger, pathToSwaggerFile)

Then we would just have to decouple our Endpoints from their implementation (coupling them is what made them ServerEndpoints, by the way, but you only need an Endpoint to generate the OpenAPI spec).


// MyApiEndpoints.scala
class Endpoints {
  protected val booksListingEndpoint: PublicEndpoint[(BooksQuery, Limit, AuthToken), String, List[Book], Any] =
    endpoint
      .get
      .in(("books" / path[String]("genre") / path[Int]("year")).mapTo[BooksQuery])
      .in(query[Limit]("limit").description("Maximum number of books to retrieve"))
      .in(header[AuthToken]("X-Auth-Token"))
      .errorOut(stringBody)
      .out(jsonBody[List[Book]])

  val plainEndpoints: List[AnyEndpoint] = List(booksListingEndpoint)
}

// MyApiRoutes.scala
class Routes(api: MyApiService) extends Endpoints {
  private val booksListing =
    booksListingEndpoint
      .serverLogic { case (booksQuery, limit, authToken) =>
        api.listBooks(authToken, booksQuery, limit)
      }

  val endpoints: List[ServerEndpoint[Fs2Streams[IO], IO]] = List(
    booksListing
  )

  val routes: HttpRoutes[IO] = Http4sServerInterpreter[IO]()
    .toRoutes(endpoints)
}

Then we just need a way to compile and run code that calls OpenAPIDocsInterpreter on new Endpoints().plainEndpoints. Easy, right?

Let's over engineer it


Splitting endpoints from their server implementation is the right and easy thing to do, but we can go further. What if we could leverage our build system to generate a swagger for us each time we compile?

Making an sbt plugin



sbt is the de facto build tool in Scala, so I made a plugin to do the generating for us. A plugin can define a task, which you can then invoke as a command inside the sbt shell.


import sbt.Keys.*
import sbt.*

object SpecGen extends AutoPlugin {
  object autoImport {
    val Spec = config("spec").extend(Runtime)
    val specGenMain = settingKey[String]("Main class (FQDN) to run")
    val specGenArgs = settingKey[Seq[String]]("Arguments to pass to runner")
    val specGenMake = taskKey[Unit]("run code/resource generation from config")
  }

  import autoImport.*

  override def projectSettings: Seq[Def.Setting[?]] =
    inConfig(Spec)(Defaults.configSettings ++ Seq(
      specGenMain := "<user defined>",
      specGenArgs := Seq.empty,
      specGenMake := {
        val logger = streams.value.log
        val classPath = Attributed.data((Spec / fullClasspath).value)
        (Spec / runner).value.run(specGenMain.value, classPath, specGenArgs.value, logger).get
      }
    )) :+ (ivyConfigurations := overrideConfigs(Spec)(ivyConfigurations.value))
}
Tip


This small implementation is the bare minimum needed to register a main class and run it via sbt.

To make our task run after compile, we would have to make it return the generated files (resource generators are tasks producing a Seq[File]) and add a resource generator: Compile / resourceGenerators += (Spec / specGenMake).taskValue. A rough sketch follows below.
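
A sketch of that variant, under a few assumptions: the key now returns Seq[File], the generator main class accepts the output path as an extra argument, and the file is written under Spec / resourceManaged (none of this is part of the plugin above):


// In the plugin: the task now returns the files it produced.
val specGenMake = taskKey[Seq[File]]("run code/resource generation from config")

specGenMake := {
  val logger = streams.value.log
  val out = (Spec / resourceManaged).value / "swagger.yaml" // assumed output location
  val classPath = Attributed.data((Spec / fullClasspath).value)
  (Spec / runner).value.run(specGenMain.value, classPath, specGenArgs.value :+ out.getAbsolutePath, logger).get
  Seq(out)
}

// In the build: register the task as a resource generator,
// so it runs whenever resources are collected.
Compile / resourceGenerators += (Spec / specGenMake).taskValue
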
This simple plugin can be enabled on projects as follows:


val `my-api` = (project in file("application/my-api"))
  .settings(...)
  .enablePlugins(SpecGen)
  .settings(
    Spec / specGenMain := "org.myapi.GenerateSwagger"
  )

// we need a project to be able to `publish` the swagger.
// GenerateSwagger would write to the <module>/src/resources/swagger.yaml
val `my-api-swagger` = project in file("modules/my-api-swagger")

And we just have to write a GenerateSwagger class that instantiates Endpoints, calls OpenAPIDocsInterpreter on them, and writes the resulting YAML to a file!
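
A minimal sketch of what that generator could look like (assumptions: the output path matches the comment in the build definition above, and the yaml import path corresponds to recent tapir versions; older versions expose it under a different package):


// GenerateSwagger.scala
import java.nio.file.{Files, Paths}

import sttp.apispec.openapi.circe.yaml._ // provides .toYaml
import sttp.tapir.docs.openapi.OpenAPIDocsInterpreter

object GenerateSwagger {
  def main(args: Array[String]): Unit = {
    // Interpret the plain endpoints (no server logic required) into an OpenAPI document.
    val yaml = OpenAPIDocsInterpreter()
      .toOpenAPI(new Endpoints().plainEndpoints, "my app", "v0.0.1")
      .toYaml

    // Assumed destination: the resources of the swagger module we publish.
    val target = Paths.get("modules/my-api-swagger/src/resources/swagger.yaml")
    Files.createDirectories(target.getParent)
    Files.write(target, yaml.getBytes("UTF-8"))
  }
}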

Note


For now, this small implementation requires the user to run Spec / specGenMake; compile to generate the swagger and compile the code, until I revisit it. For our needs it isn't a huge deal, as we generate and publish swaggers inside our CI/CD environment.
Generating Scala code from a published swagger


How would you generate code from a swagger inside a jar downloaded by sbt?

Using swaggerinator, the swagger downloader and unpacker, it's quite easy! Add your swagger module as a dependency and let the plugin do the job:


// build.sbt
object Dependencies {
  lazy val swagger = "com.example" % "my-api-swagger_2.13" % "1.0.0"
  lazy val circe = Seq(
    "circe-core",
    "circe-generic",
    "circe-generic-extras"
  ).map("io.circe" %% _ % "0.14.1")
}

lazy val `my-api-generated` = (project in file("modules/my-api-generated"))
  .enablePlugins(OpenApiSchema)
  .enablePlugins(SwaggerinatorSbt)
  .settings(swaggerinatorDependency := Dependencies.swagger)
  .settings(swaggerinatorPackage := Pkg("com.example.my-api.generated"))
  .settings(libraryDependencies ++= Dependencies.circe)

lazy val infrastructure = (project in file("modules/infrastructure"))
  // ...
  .dependsOn(`my-api-generated` % "compile->compile")

Passing the dependency to swaggerinator lets it access the swagger inside the jar by unzipping it (after all, a jar is just a ZIP file with another extension). The plugin then generates code from the swagger by calling the OpenApiSchema plugin for us.
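
Roughly, the idea looks like this (an illustrative sketch only, not swaggerinator's actual implementation; the task name and file layout are assumptions):


import sbt._
import sbt.Keys._

val extractSwagger = taskKey[File]("extract swagger.yaml from the published swagger artifact")

extractSwagger := {
  // Locate the resolved jar of the swagger dependency in the update report...
  val jar = update.value
    .matching(moduleFilter(organization = "com.example", name = "my-api-swagger_2.13"))
    .head
  // ...and unzip only the yaml entries from it (a jar is just a ZIP file).
  val outDir = target.value / "swagger-extracted"
  IO.unzip(jar, outDir, "*.yaml")
  outDir / "swagger.yaml"
}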
